[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11389 1726854846.84883: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ZzD
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11389 1726854846.86336: Added group all to inventory
11389 1726854846.86339: Added group ungrouped to inventory
11389 1726854846.86344: Group all now contains ungrouped
11389 1726854846.86348: Examining possible inventory source: /tmp/network-Koj/inventory.yml
11389 1726854847.25345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11389 1726854847.25629: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11389 1726854847.25653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11389 1726854847.25855: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11389 1726854847.25998: Loaded config def from plugin (inventory/script)
11389 1726854847.26000: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11389 1726854847.26047: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11389 1726854847.26211: Loaded config def from plugin
(inventory/yaml) 11389 1726854847.26213: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11389 1726854847.26631: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11389 1726854847.28035: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11389 1726854847.28039: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11389 1726854847.28042: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11389 1726854847.28048: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11389 1726854847.28053: Loading data from /tmp/network-Koj/inventory.yml 11389 1726854847.28350: /tmp/network-Koj/inventory.yml was not parsable by auto 11389 1726854847.28611: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11389 1726854847.28723: Loading data from /tmp/network-Koj/inventory.yml 11389 1726854847.28931: group all already in inventory 11389 1726854847.28938: set inventory_file for managed_node1 11389 1726854847.28943: set inventory_dir for managed_node1 11389 1726854847.28944: Added host managed_node1 to inventory 11389 1726854847.28946: Added host managed_node1 to group all 11389 1726854847.28947: set ansible_host for managed_node1 11389 1726854847.28947: set ansible_ssh_extra_args for managed_node1 11389 1726854847.29179: set inventory_file for managed_node2 11389 1726854847.29183: set inventory_dir for managed_node2 11389 1726854847.29184: Added host managed_node2 to inventory 11389 1726854847.29186: Added host managed_node2 to group all 11389 1726854847.29188: set ansible_host for managed_node2 11389 1726854847.29189: set ansible_ssh_extra_args for managed_node2 11389 
1726854847.29192: set inventory_file for managed_node3 11389 1726854847.29195: set inventory_dir for managed_node3 11389 1726854847.29196: Added host managed_node3 to inventory 11389 1726854847.29197: Added host managed_node3 to group all 11389 1726854847.29198: set ansible_host for managed_node3 11389 1726854847.29199: set ansible_ssh_extra_args for managed_node3 11389 1726854847.29201: Reconcile groups and hosts in inventory. 11389 1726854847.29205: Group ungrouped now contains managed_node1 11389 1726854847.29207: Group ungrouped now contains managed_node2 11389 1726854847.29209: Group ungrouped now contains managed_node3 11389 1726854847.29518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11389 1726854847.29907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11389 1726854847.30186: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11389 1726854847.30219: Loaded config def from plugin (vars/host_group_vars) 11389 1726854847.30222: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11389 1726854847.30229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11389 1726854847.30237: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11389 1726854847.30507: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11389 1726854847.31590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854847.31946: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11389 1726854847.31994: Loaded config def from plugin (connection/local) 11389 1726854847.31998: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11389 1726854847.33627: Loaded config def from plugin (connection/paramiko_ssh) 11389 1726854847.33630: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11389 1726854847.36069: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11389 1726854847.36151: Loaded config def from plugin (connection/psrp) 11389 1726854847.36154: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11389 1726854847.37616: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11389 1726854847.37670: Loaded config def from plugin (connection/ssh) 11389 1726854847.37674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11389 1726854847.40360: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11389 1726854847.40517: Loaded config def from plugin (connection/winrm) 11389 1726854847.40520: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11389 1726854847.40565: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11389 1726854847.40661: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11389 1726854847.40862: Loaded config def from plugin (shell/cmd) 11389 1726854847.40864: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11389 1726854847.40900: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11389 1726854847.41009: Loaded config def from plugin (shell/powershell) 11389 1726854847.41011: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11389 1726854847.41077: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11389 1726854847.41295: Loaded config def from plugin (shell/sh) 11389 1726854847.41298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11389 1726854847.41332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11389 1726854847.41475: Loaded config def from plugin (become/runas) 11389 1726854847.41477: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11389 1726854847.41691: Loaded config def from plugin (become/su) 11389 1726854847.41694: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11389 1726854847.41846: Loaded config def from plugin (become/sudo) 11389 
1726854847.41848: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11389 1726854847.41873: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 11389 1726854847.42124: in VariableManager get_vars() 11389 1726854847.42140: done with get_vars() 11389 1726854847.42232: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11389 1726854847.45327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11389 1726854847.45592: in VariableManager get_vars() 11389 1726854847.45598: done with get_vars() 11389 1726854847.45601: variable 'playbook_dir' from source: magic vars 11389 1726854847.45602: variable 'ansible_playbook_python' from source: magic vars 11389 1726854847.45603: variable 'ansible_config_file' from source: magic vars 11389 1726854847.45604: variable 'groups' from source: magic vars 11389 1726854847.45605: variable 'omit' from source: magic vars 11389 1726854847.45605: variable 'ansible_version' from source: magic vars 11389 1726854847.45606: variable 'ansible_check_mode' from source: magic vars 11389 1726854847.45607: variable 'ansible_diff_mode' from source: magic vars 11389 1726854847.45607: variable 'ansible_forks' from source: magic vars 11389 1726854847.45608: variable 'ansible_inventory_sources' from source: magic vars 11389 1726854847.45609: variable 'ansible_skip_tags' from source: magic vars 11389 1726854847.45610: variable 'ansible_limit' from source: magic vars 11389 1726854847.45610: variable 'ansible_run_tags' from source: magic vars 11389 1726854847.45698: variable 'ansible_verbosity' from source: magic vars 11389 1726854847.45858: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml 11389 1726854847.47016: in VariableManager get_vars() 
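The inventory parsing earlier in the run (three hosts managed_node1..3 added to group all, each with inventory_file, ansible_host, and ansible_ssh_extra_args set from /tmp/network-Koj/inventory.yml, parsed by the yaml inventory plugin after auto declined it) corresponds to a YAML inventory shaped roughly like the following sketch; the addresses and SSH options here are placeholders, not values recorded in this log:

```yaml
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.11          # placeholder, not from this run
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder
    managed_node2:
      ansible_host: 192.0.2.12
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    managed_node3:
      ansible_host: 192.0.2.13
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
```

Because the hosts are declared directly under `all` with no intermediate group, the "Reconcile groups and hosts" step places them in `ungrouped`, exactly as the log reports.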
11389 1726854847.47034: done with get_vars() 11389 1726854847.47043: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11389 1726854847.47949: in VariableManager get_vars() 11389 1726854847.47964: done with get_vars() 11389 1726854847.47975: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11389 1726854847.48085: in VariableManager get_vars() 11389 1726854847.48103: done with get_vars() 11389 1726854847.48249: in VariableManager get_vars() 11389 1726854847.48263: done with get_vars() 11389 1726854847.48275: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11389 1726854847.48428: in VariableManager get_vars() 11389 1726854847.48442: done with get_vars() 11389 1726854847.48747: in VariableManager get_vars() 11389 1726854847.48760: done with get_vars() 11389 1726854847.48765: variable 'omit' from source: magic vars 11389 1726854847.48786: variable 'omit' from source: magic vars 11389 1726854847.48828: in VariableManager get_vars() 11389 1726854847.48845: done with get_vars() 11389 1726854847.48901: in VariableManager get_vars() 11389 1726854847.48914: done with get_vars() 11389 1726854847.49045: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11389 1726854847.49595: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11389 1726854847.49717: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11389 1726854847.50378: in VariableManager get_vars() 11389 1726854847.50404: done with get_vars() 11389 1726854847.50747: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11389 1726854847.50849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11389 1726854847.52317: in VariableManager get_vars() 11389 1726854847.52329: done with get_vars() 11389 1726854847.52335: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11389 1726854847.52451: in VariableManager get_vars() 11389 1726854847.52465: done with get_vars() 11389 1726854847.52542: in VariableManager get_vars() 11389 1726854847.52552: done with get_vars() 11389 1726854847.52735: in VariableManager get_vars() 11389 1726854847.52746: done with get_vars() 11389 1726854847.52750: variable 'omit' from source: magic vars 11389 1726854847.52764: variable 'omit' from source: magic vars 11389 1726854847.52789: in VariableManager get_vars() 11389 1726854847.52800: done with get_vars() 11389 1726854847.52813: in VariableManager get_vars() 11389 1726854847.52823: done with get_vars() 11389 1726854847.52841: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11389 1726854847.52911: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11389 1726854847.54065: Loading data from 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11389 1726854847.54286: in VariableManager get_vars() 11389 1726854847.54302: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11389 1726854847.55950: in VariableManager get_vars() 11389 1726854847.55964: done with get_vars() 11389 1726854847.55971: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11389 1726854847.56286: in VariableManager get_vars() 11389 1726854847.56302: done with get_vars() 11389 1726854847.56340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11389 1726854847.56349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11389 1726854847.56509: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11389 1726854847.56605: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11389 1726854847.56607: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11389 1726854847.56630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11389 1726854847.56646: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11389 1726854847.56745: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11389 1726854847.56781: Loaded config def from plugin (callback/default) 11389 1726854847.56783: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11389 1726854847.57524: Loaded config def from plugin (callback/junit) 11389 1726854847.57526: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11389 1726854847.57556: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11389 1726854847.57595: Loaded config def from plugin (callback/minimal) 11389 1726854847.57597: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11389 1726854847.57625: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11389 1726854847.57662: Loaded config def from plugin (callback/tree)
11389 1726854847.57664: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
11389 1726854847.57744: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
11389 1726854847.57746: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
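The deprecation warning at the top of the run names its own remedies: switch to the singular ANSIBLE_COLLECTIONS_PATH environment variable, or set deprecation_warnings=False in ansible.cfg. Since this run found no config file ("No config file found; using defaults"), a minimal ansible.cfg covering both would look like the sketch below; the collections_path value is taken from the run's reported collection location:

```ini
[defaults]
# Silence deprecation warnings, as the warning itself suggests
deprecation_warnings = False
# Config-file equivalent of the singular ANSIBLE_COLLECTIONS_PATH variable
collections_path = /tmp/collections-ZzD
```

Either the config key or the environment variable works; the warning is only about the plural ANSIBLE_COLLECTIONS_PATHS spelling, which ansible-core plans to remove in 2.19.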
PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11389 1726854847.57764: in VariableManager get_vars()
11389 1726854847.57772: done with get_vars()
11389 1726854847.57776: in VariableManager get_vars()
11389 1726854847.57781: done with get_vars()
11389 1726854847.57784: variable 'omit' from source: magic vars
11389 1726854847.57807: in VariableManager get_vars()
11389 1726854847.57816: done with get_vars()
11389 1726854847.57831: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
11389 1726854847.58189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11389 1726854847.58237: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11389 1726854847.58264: getting the remaining hosts for this loop
11389 1726854847.58266: done getting the remaining hosts for this loop
11389 1726854847.58268: getting the next task for host managed_node3
11389 1726854847.58271: done getting next task for host managed_node3
11389 1726854847.58273: ^ task is: TASK: Gathering Facts
11389 1726854847.58274: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854847.58276: getting variables
11389 1726854847.58276: in VariableManager get_vars()
11389 1726854847.58283: Calling all_inventory to load vars for managed_node3
11389 1726854847.58284: Calling groups_inventory to load vars for managed_node3
11389 1726854847.58286: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854847.58296: Calling all_plugins_play to load vars for managed_node3
11389 1726854847.58303: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854847.58305: Calling groups_plugins_play to load vars for managed_node3
11389 1726854847.58325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854847.58358: done with get_vars()
11389 1726854847.58363: done getting variables
11389 1726854847.58410: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Friday 20 September 2024 13:54:07 -0400 (0:00:00.007) 0:00:00.007 ******
11389 1726854847.58424: entering _queue_task() for managed_node3/gather_facts
11389 1726854847.58425: Creating lock for gather_facts
11389 1726854847.58709: worker is 1 (out of 1 available)
11389 1726854847.58719: exiting _queue_task() for managed_node3/gather_facts
11389 1726854847.58732: done queuing things up, now waiting for results queue to drain
11389 1726854847.58734: waiting for pending results...
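Once the worker picks this task up, the first remote action the executor takes (visible further down as _low_level_execute_command()) is to run /bin/sh -c 'echo ~ && sleep 0' on the target and read back the remote user's home directory from stdout. A minimal local sketch of that probe, using subprocess in place of Ansible's SSH connection plugin (low_level_execute is a hypothetical name for illustration):

```python
import subprocess

def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Rough stand-in for Ansible's _low_level_execute_command():
    run a command under /bin/sh and capture rc, stdout, and stderr.
    Ansible does this over the SSH connection plugin; here it runs locally."""
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The home-directory probe from the log: sh expands ~ to $HOME,
# and 'sleep 0' just keeps the pipeline shape Ansible expects.
rc, out, err = low_level_execute("echo ~ && sleep 0")
```

In the log this round-trip is what produces the `rc=0, stdout=/root` result, after which the executor reuses the same mechanism to create the remote ansible-tmp working directory.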
11389 1726854847.58862: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11389 1726854847.58919: in run() - task 0affcc66-ac2b-deb8-c119-0000000000cc 11389 1726854847.58932: variable 'ansible_search_path' from source: unknown 11389 1726854847.58961: calling self._execute() 11389 1726854847.59011: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854847.59015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854847.59023: variable 'omit' from source: magic vars 11389 1726854847.59143: variable 'omit' from source: magic vars 11389 1726854847.59164: variable 'omit' from source: magic vars 11389 1726854847.59212: variable 'omit' from source: magic vars 11389 1726854847.59242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854847.59272: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854847.59294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854847.59340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854847.59344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854847.59368: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854847.59372: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854847.59375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854847.59530: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854847.59533: Set connection var ansible_timeout to 10 11389 1726854847.59535: Set connection var ansible_connection to ssh 11389 1726854847.59538: Set connection var ansible_shell_type to sh 
11389 1726854847.59540: Set connection var ansible_pipelining to False 11389 1726854847.59542: Set connection var ansible_shell_executable to /bin/sh 11389 1726854847.59545: variable 'ansible_shell_executable' from source: unknown 11389 1726854847.59549: variable 'ansible_connection' from source: unknown 11389 1726854847.59551: variable 'ansible_module_compression' from source: unknown 11389 1726854847.59553: variable 'ansible_shell_type' from source: unknown 11389 1726854847.59556: variable 'ansible_shell_executable' from source: unknown 11389 1726854847.59558: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854847.59560: variable 'ansible_pipelining' from source: unknown 11389 1726854847.59562: variable 'ansible_timeout' from source: unknown 11389 1726854847.59565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854847.59759: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854847.59763: variable 'omit' from source: magic vars 11389 1726854847.59765: starting attempt loop 11389 1726854847.59770: running the handler 11389 1726854847.59773: variable 'ansible_facts' from source: unknown 11389 1726854847.59785: _low_level_execute_command(): starting 11389 1726854847.59794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854847.60510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854847.60553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854847.60564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854847.60584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854847.60701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854847.62389: stdout chunk (state=3): >>>/root <<< 11389 1726854847.62584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854847.62593: stderr chunk (state=3): >>><<< 11389 1726854847.62596: stdout chunk (state=3): >>><<< 11389 1726854847.62699: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854847.62702: _low_level_execute_command(): starting 11389 1726854847.62725: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993 `" && echo ansible-tmp-1726854847.6262517-11461-187239997535993="` echo /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993 `" ) && sleep 0' 11389 1726854847.63342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854847.63402: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854847.63405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854847.63491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854847.65422: stdout chunk (state=3): >>>ansible-tmp-1726854847.6262517-11461-187239997535993=/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993 <<< 11389 1726854847.65589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854847.65619: stderr chunk (state=3): >>><<< 11389 1726854847.65623: stdout chunk (state=3): >>><<< 11389 1726854847.65640: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854847.6262517-11461-187239997535993=/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854847.65690: variable 'ansible_module_compression' from source: unknown 11389 1726854847.65754: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11389 1726854847.65758: ANSIBALLZ: Acquiring lock 11389 1726854847.65761: ANSIBALLZ: Lock acquired: 140464425326096 11389 1726854847.65762: ANSIBALLZ: Creating module 11389 1726854847.94026: ANSIBALLZ: Writing module into payload 11389 1726854847.94078: ANSIBALLZ: Writing module 11389 1726854847.94129: ANSIBALLZ: Renaming module 11389 1726854847.94139: ANSIBALLZ: Done creating module 11389 1726854847.94164: variable 'ansible_facts' from source: unknown 11389 1726854847.94185: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854847.94210: _low_level_execute_command(): starting 11389 1726854847.94220: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11389 1726854847.94977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854847.94999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854847.95091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854847.95140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854847.95172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854847.95237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854847.96965: stdout chunk (state=3): >>>PLATFORM <<< 11389 1726854847.97073: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11389 1726854847.97202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854847.97227: stderr chunk (state=3): >>><<< 11389 1726854847.97231: stdout chunk (state=3): >>><<< 11389 1726854847.97243: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854847.97260 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11389 1726854847.97301: _low_level_execute_command(): starting 11389 1726854847.97304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11389 1726854847.97437: Sending initial data 11389 1726854847.97440: Sent initial data (1181 bytes) 11389 1726854847.98020: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854847.98023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854847.98026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854847.98125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
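The `PLATFORM`/`FOUND`/`ENDFOUND` markers in the probe output above are how Ansible's interpreter discovery delimits the results of the `command -v` chain. A minimal sketch of parsing that marker format (the real, more defensive parser lives in ansible-core's `interpreter_discovery.py`; this is an illustration only):

```python
def parse_interpreter_probe(stdout: str):
    """Split the PLATFORM/FOUND/ENDFOUND marker output produced by the
    discovery shell snippet seen in the log. Minimal sketch, not
    ansible-core's actual implementation."""
    lines = stdout.splitlines()
    platform = lines[lines.index("PLATFORM") + 1]
    found = lines[lines.index("FOUND") + 1 : lines.index("ENDFOUND")]
    return platform, found

platform, interpreters = parse_interpreter_probe(
    "PLATFORM\nLinux\nFOUND\n"
    "/usr/bin/python3.12\n/usr/bin/python3\n/usr/bin/python3\n"
    "ENDFOUND\n"
)
# platform == "Linux"; the first hit, /usr/bin/python3.12, is selected,
# matching the "found interpreters" line in the log.
```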
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854847.98168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854847.98269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854848.01673: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11389 1726854848.02096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854848.02181: stderr chunk (state=3): >>><<< 11389 1726854848.02184: stdout chunk (state=3): >>><<< 11389 1726854848.02193: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854848.02202: variable 'ansible_facts' from source: unknown 11389 1726854848.02205: variable 'ansible_facts' from source: unknown 11389 1726854848.02214: variable 'ansible_module_compression' from source: unknown 11389 1726854848.02244: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11389 1726854848.02267: variable 'ansible_facts' from source: unknown 11389 1726854848.02364: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py 11389 1726854848.02471: Sending initial data 11389 1726854848.02475: Sent initial data (154 bytes) 11389 1726854848.02919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.02922: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.02925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854848.02927: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.02929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.02982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854848.02986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854848.02993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854848.03056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854848.04924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854848.05006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854848.05010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py" <<< 11389 1726854848.05013: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpbs5ewju7 /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py <<< 11389 1726854848.05147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpbs5ewju7" to remote "/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py" <<< 11389 1726854848.06664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854848.06724: stderr chunk (state=3): >>><<< 11389 1726854848.06741: stdout chunk (state=3): >>><<< 11389 1726854848.06875: done transferring module to remote 11389 1726854848.06878: _low_level_execute_command(): starting 11389 1726854848.06880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/ /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py && sleep 0' 11389 1726854848.07442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854848.07456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854848.07474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.07500: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854848.07595: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854848.07647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854848.07709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854848.09499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854848.09501: stdout chunk (state=3): >>><<< 11389 1726854848.09503: stderr chunk (state=3): >>><<< 11389 1726854848.09512: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
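The `osrelease_content` field returned by the discovery one-liner earlier carries the target's raw `/etc/os-release` text, which is how Ansible identifies the platform as CentOS Stream 10. A small sketch of reading that key=value format (`shlex` handles the shell-style quoting; this is illustrative, not Ansible's actual parser):

```python
import shlex

def parse_os_release(content: str) -> dict:
    # Sketch: split each KEY="value" line of os-release content like the
    # osrelease_content string in the log; shlex strips the quoting.
    info = {}
    for line in content.splitlines():
        if "=" in line and not line.startswith("#"):
            key, _, value = line.partition("=")
            info[key] = shlex.split(value)[0] if value.strip() else ""
    return info

osr = parse_os_release(
    'NAME="CentOS Stream"\nID="centos"\nPLATFORM_ID="platform:el10"\n'
)
# osr["NAME"] == "CentOS Stream"
```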
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854848.09521: _low_level_execute_command(): starting 11389 1726854848.09529: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/AnsiballZ_setup.py && sleep 0' 11389 1726854848.09938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.09941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.09944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.09946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.10000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854848.10003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854848.10072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854848.12199: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11389 1726854848.12231: stdout chunk (state=3): >>>import _imp # builtin <<< 11389 1726854848.12262: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 11389 1726854848.12268: stdout chunk (state=3): >>>import '_weakref' # <<< 11389 1726854848.12328: stdout chunk (state=3): >>>import '_io' # <<< 11389 1726854848.12338: stdout chunk (state=3): >>>import 'marshal' # <<< 11389 1726854848.12365: stdout chunk (state=3): >>>import 'posix' # <<< 11389 1726854848.12398: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11389 1726854848.12406: stdout chunk (state=3): >>># installing zipimport hook <<< 11389 1726854848.12429: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 11389 1726854848.12436: stdout chunk (state=3): >>> # installed zipimport hook <<< 11389 1726854848.12485: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 11389 1726854848.12491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.12502: stdout chunk (state=3): >>>import '_codecs' # <<< 11389 1726854848.12529: stdout chunk (state=3): >>>import 'codecs' # <<< 11389 1726854848.12556: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11389 
1726854848.12581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11389 1726854848.12594: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfe184d0> <<< 11389 1726854848.12600: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfde7b30> <<< 11389 1726854848.12618: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11389 1726854848.12633: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfe1aa50> <<< 11389 1726854848.12649: stdout chunk (state=3): >>>import '_signal' # <<< 11389 1726854848.12681: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11389 1726854848.12697: stdout chunk (state=3): >>>import 'io' # <<< 11389 1726854848.12726: stdout chunk (state=3): >>>import '_stat' # <<< 11389 1726854848.12731: stdout chunk (state=3): >>>import 'stat' # <<< 11389 1726854848.12807: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11389 1726854848.12839: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11389 1726854848.12868: stdout chunk (state=3): >>>import 'os' # <<< 11389 1726854848.12906: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11389 1726854848.12909: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 11389 1726854848.12912: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11389 1726854848.12926: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11389 1726854848.12950: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 11389 1726854848.12957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11389 1726854848.12975: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc09130> <<< 11389 1726854848.13029: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11389 1726854848.13040: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.13046: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc09fa0> <<< 11389 1726854848.13070: stdout chunk (state=3): >>>import 'site' # <<< 11389 1726854848.13102: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
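The version banner and the stream of `import ...` lines above are produced because the module was launched with `PYTHONVERBOSE=1`, which makes CPython trace every import at startup. The same effect can be reproduced locally with the equivalent `-v` flag (a sketch; note that in a plain invocation this trace is written to stderr):

```python
import subprocess
import sys

# -v is the command-line equivalent of PYTHONVERBOSE=1: CPython prints
# its version banner plus a line for every module imported at startup.
result = subprocess.run(
    [sys.executable, "-v", "-c", "import base64"],
    capture_output=True,
    text=True,
)
# result.stderr contains the import trace, including base64's loader line.
```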
<<< 11389 1726854848.13476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11389 1726854848.13483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11389 1726854848.13509: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11389 1726854848.13512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.13537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11389 1726854848.13575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11389 1726854848.13586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11389 1726854848.13625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11389 1726854848.13632: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc47da0> <<< 11389 1726854848.13648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11389 1726854848.13663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11389 1726854848.13684: stdout chunk (state=3): >>>import '_operator' # <<< 11389 1726854848.13693: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc47fb0> <<< 11389 1726854848.13705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11389 1726854848.13736: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11389 1726854848.13756: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11389 1726854848.13805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.13814: stdout chunk (state=3): >>>import 'itertools' # <<< 11389 1726854848.13848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc7f770> <<< 11389 1726854848.13878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 11389 1726854848.13891: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc7fe00> <<< 11389 1726854848.13899: stdout chunk (state=3): >>>import '_collections' # <<< 11389 1726854848.13944: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5fa40> <<< 11389 1726854848.13961: stdout chunk (state=3): >>>import '_functools' # <<< 11389 1726854848.13982: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5d160> <<< 11389 1726854848.14085: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc44f50> <<< 11389 1726854848.14102: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11389 
1726854848.14122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11389 1726854848.14146: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11389 1726854848.14165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11389 1726854848.14197: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11389 1726854848.14227: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9f6b0> <<< 11389 1726854848.14241: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9e2d0> <<< 11389 1726854848.14302: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9cb60> <<< 11389 1726854848.14345: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc441d0> <<< 11389 1726854848.14403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # 
code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11389 1726854848.14410: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.14434: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcd4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd4a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcd4dd0> <<< 11389 1726854848.14458: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc42cf0> <<< 11389 1726854848.14486: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.14521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11389 1726854848.14547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd54c0><<< 11389 1726854848.14564: stdout chunk (state=3): >>> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd5190> import 'importlib.machinery' # <<< 11389 1726854848.14601: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11389 1726854848.14605: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd63c0> <<< 11389 1726854848.14642: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11389 1726854848.14645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11389 1726854848.14686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11389 1726854848.14715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf05c0> import 'errno' # <<< 11389 1726854848.14752: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.14776: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf1d00> <<< 11389 1726854848.14807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11389 1726854848.14818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf2ba0> <<< 11389 1726854848.14868: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf20f0> <<< 11389 1726854848.14892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11389 1726854848.14908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11389 1726854848.14941: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf3c80> <<< 11389 1726854848.14974: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf33b0> <<< 11389 1726854848.15002: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd6330> <<< 11389 1726854848.15013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11389 1726854848.15044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11389 1726854848.15072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11389 1726854848.15104: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11389 1726854848.15128: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf9ebbf0> <<< 11389 1726854848.15155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11389 1726854848.15205: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa146e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa14440> <<< 11389 1726854848.15208: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa14710> <<< 11389 1726854848.15230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11389 1726854848.15293: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.15423: stdout chunk (state=3): >>># extension module 
'_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa14fe0> <<< 11389 1726854848.15532: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.15572: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa159d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa14890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf9e9d90> <<< 11389 1726854848.15577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11389 1726854848.15634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11389 1726854848.15637: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa16db0> <<< 11389 1726854848.15677: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa15af0> <<< 11389 1726854848.15693: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd6ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11389 1726854848.15771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' 
<<< 11389 1726854848.15808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11389 1726854848.15811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11389 1726854848.15827: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa3f110> <<< 11389 1726854848.15919: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11389 1726854848.15923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.15934: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11389 1726854848.15976: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa63470> <<< 11389 1726854848.16005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11389 1726854848.16035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11389 1726854848.16099: stdout chunk (state=3): >>>import 'ntpath' # <<< 11389 1726854848.16138: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac4290> <<< 11389 1726854848.16154: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11389 1726854848.16182: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11389 1726854848.16216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11389 1726854848.16302: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac69f0> <<< 11389 1726854848.16375: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac43b0> <<< 11389 1726854848.16422: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa91280> <<< 11389 1726854848.16469: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3253d0> <<< 11389 1726854848.16473: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa62270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa17ce0> <<< 11389 1726854848.16624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11389 1726854848.16647: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8acfa62870> <<< 11389 1726854848.16982: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sdhevthf/ansible_ansible.legacy.setup_payload.zip' # 
zipimport: zlib available <<< 11389 1726854848.17098: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.17143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11389 1726854848.17146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11389 1726854848.17176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11389 1726854848.17273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11389 1726854848.17284: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf38b1a0> import '_typing' # <<< 11389 1726854848.17486: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf36a090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3691f0> # zipimport: zlib available <<< 11389 1726854848.17519: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.17558: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.17578: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 11389 1726854848.17586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.18945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.20096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11389 1726854848.20126: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf388e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11389 1726854848.20159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11389 1726854848.20179: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11389 1726854848.20216: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.20239: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3baa80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba810> <<< 11389 1726854848.20274: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba150> <<< 11389 1726854848.20285: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11389 1726854848.20332: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba8a0> <<< 11389 1726854848.20372: stdout 
chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf38be30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3bb7d0> <<< 11389 1726854848.20404: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3bb9e0> <<< 11389 1726854848.20415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11389 1726854848.20474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11389 1726854848.20516: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3bbf20> import 'pwd' # <<< 11389 1726854848.20550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11389 1726854848.20566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11389 1726854848.20606: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf225ca0> <<< 11389 1726854848.20636: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.20652: stdout chunk (state=3): >>># extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf2278c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11389 1726854848.20676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11389 1726854848.20727: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2282c0> <<< 11389 1726854848.20731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11389 1726854848.20768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11389 1726854848.20771: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2291c0> <<< 11389 1726854848.20791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11389 1726854848.20835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11389 1726854848.20848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11389 1726854848.20890: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22bec0> <<< 11389 1726854848.20933: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import 
'_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfc42de0> <<< 11389 1726854848.20969: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22a1b0> <<< 11389 1726854848.20986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11389 1726854848.21016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11389 1726854848.21043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11389 1726854848.21175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11389 1726854848.21178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11389 1726854848.21201: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf233e30> import '_tokenize' # <<< 11389 1726854848.21268: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232900> <<< 11389 1726854848.21283: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11389 1726854848.21362: stdout chunk (state=3): >>>import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232bd0> <<< 11389 1726854848.21414: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22a6c0> <<< 11389 1726854848.21419: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf277fe0> <<< 11389 1726854848.21467: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2781a0> <<< 11389 1726854848.21481: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11389 1726854848.21540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11389 1726854848.21544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.21561: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf279c10> import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11389 1726854848.21584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11389 1726854848.21651: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.21654: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf27c170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11389 1726854848.21727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.21745: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11389 1726854848.21747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11389 1726854848.21769: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27f950> <<< 11389 1726854848.21896: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27c320> <<< 11389 1726854848.21970: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280a40> <<< 11389 1726854848.21986: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280b90> <<< 11389 1726854848.22030: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280ad0> <<< 11389 1726854848.22063: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf278350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11389 1726854848.22098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11389 1726854848.22132: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.22147: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf108200> <<< 11389 1726854848.22303: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf109460> <<< 11389 1726854848.22356: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf282990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf283d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf282600> <<< 11389 1726854848.22359: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.22389: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11389 1726854848.22475: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.22571: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.22575: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11389 1726854848.22619: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11389 1726854848.22637: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.22744: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11389 1726854848.22861: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.23398: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.23938: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11389 1726854848.23960: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11389 1726854848.23990: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.24036: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf111640> <<< 11389 1726854848.24129: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11389 1726854848.24139: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf112420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1096a0> <<< 11389 1726854848.24198: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 11389 1726854848.24233: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.24250: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11389 1726854848.24393: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11389 1726854848.24561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11389 1726854848.24564: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf112330> # zipimport: zlib available <<< 11389 1726854848.25019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25477: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25517: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25603: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11389 1726854848.25614: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25643: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25677: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11389 1726854848.25755: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25832: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11389 1726854848.25865: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.25881: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11389 1726854848.25913: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.25965: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11389 1726854848.25968: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26195: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26407: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11389 
1726854848.26465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11389 1726854848.26477: stdout chunk (state=3): >>>import '_ast' # <<< 11389 1726854848.26550: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1134d0> <<< 11389 1726854848.26561: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26629: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26703: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11389 1726854848.26740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26776: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26819: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11389 1726854848.26821: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26866: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26906: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.26967: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27039: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11389 1726854848.27074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.27148: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf11e090> <<< 11389 1726854848.27220: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf119730> <<< 11389 1726854848.27235: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11389 1726854848.27297: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27350: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27378: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27420: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.27454: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11389 1726854848.27468: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11389 1726854848.27489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11389 1726854848.27562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11389 1726854848.27565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11389 1726854848.27583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11389 
1726854848.27629: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2069c0> <<< 11389 1726854848.27679: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2fe690> <<< 11389 1726854848.27761: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf11e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf11de20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11389 1726854848.27781: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27815: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27834: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11389 1726854848.27890: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11389 1726854848.27892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.27913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11389 1726854848.27972: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28049: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.28072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28114: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28172: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28194: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11389 1726854848.28330: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.28382: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28412: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28445: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11389 1726854848.28490: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28616: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28802: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.28920: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11389 1726854848.29062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11389 1726854848.29120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11389 1726854848.29153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee17f80> <<< 11389 1726854848.29276: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1c590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf19aea0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b2c60> <<< 11389 1726854848.29396: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b0b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11389 1726854848.29437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11389 1726854848.29441: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11389 1726854848.29457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 11389 1726854848.29480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11389 
1726854848.29501: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1eb40> <<< 11389 1726854848.29544: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1ed20> <<< 11389 1726854848.29570: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1df70> <<< 11389 1726854848.29574: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11389 1726854848.29708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11389 1726854848.29714: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1f470> <<< 11389 1726854848.29728: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11389 1726854848.29760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11389 1726854848.29796: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee81fa0> <<< 11389 1726854848.29825: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1ff80> <<< 11389 1726854848.29859: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b04a0> import 'ansible.module_utils.facts.timeout' # <<< 11389 1726854848.29895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11389 1726854848.29911: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.29924: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11389 1726854848.30060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11389 1726854848.30069: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30125: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11389 1726854848.30211: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11389 1726854848.30232: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30250: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30282: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11389 1726854848.30302: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11389 1726854848.30340: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11389 1726854848.30410: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30445: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11389 1726854848.30499: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30555: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30671: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.30732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 11389 1726854848.30753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11389 1726854848.31324: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11389 1726854848.31677: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31743: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31762: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31801: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31823: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 11389 1726854848.31848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11389 1726854848.31908: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.31931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # 
zipimport: zlib available <<< 11389 1726854848.32004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.32225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11389 1726854848.32229: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11389 1726854848.32276: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.32367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11389 1726854848.32453: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee83830> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11389 1726854848.32558: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee82a20> <<< 11389 1726854848.32582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11389 1726854848.32643: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.32708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11389 1726854848.32807: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.32912: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 11389 1726854848.32985: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33050: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11389 1726854848.33069: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33111: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33148: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11389 1726854848.33200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11389 1726854848.33273: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854848.33327: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aceebe1b0> <<< 11389 1726854848.33549: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aceeadf70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 11389 1726854848.33591: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11389 1726854848.33751: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33818: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.33933: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.34106: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11389 1726854848.34134: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.34179: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11389 1726854848.34198: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.34231: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.34497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aceed1790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aceeaf170> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 11389 1726854848.34528: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11389 1726854848.34815: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.34860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11389 1726854848.34894: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.34993: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.35061: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.35095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11389 1726854848.35170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 
1726854848.35284: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.35434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11389 1726854848.35798: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 11389 1726854848.35802: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.35804: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.36318: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.36838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11389 1726854848.36841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 11389 1726854848.36844: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.36939: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11389 1726854848.37075: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37155: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11389 1726854848.37272: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37414: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11389 1726854848.37598: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11389 1726854848.37630: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37658: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11389 1726854848.37719: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37822: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.37915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11389 1726854848.38328: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38408: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11389 1726854848.38436: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11389 1726854848.38479: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38556: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11389 1726854848.38662: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38665: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11389 1726854848.38758: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11389 1726854848.38818: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.38932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11389 1726854848.38939: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39205: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11389 1726854848.39504: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39556: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11389 1726854848.39621: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39651: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39684: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11389 1726854848.39690: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39725: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11389 1726854848.39774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39808: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11389 1726854848.39842: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.39931: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.40043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 11389 
1726854848.40092: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.40122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.40166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11389 1726854848.40204: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.40344: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.40373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.40465: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11389 1726854848.40471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11389 1726854848.40584: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854848.40618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11389 1726854848.40776: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.40984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 11389 1726854848.41016: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.41107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 11389 1726854848.41173: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.41228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11389 1726854848.41296: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.41361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 
'ansible.module_utils.facts.default_collectors' # <<< 11389 1726854848.41377: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.41538: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.41593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11389 1726854848.41630: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854848.42185: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11389 1726854848.42213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11389 1726854848.42255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11389 1726854848.42273: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acec6a4b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acec69100> <<< 11389 1726854848.42321: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acec60170> <<< 11389 1726854848.57428: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11389 1726854848.57478: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb0ec0> <<< 11389 1726854848.57496: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 11389 1726854848.57551: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb1ca0> <<< 11389 1726854848.57563: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854848.57622: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 11389 1726854848.57626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecf83e0> <<< 11389 1726854848.57641: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb3f80> <<< 11389 1726854848.57872: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11389 1726854848.78121: stdout chunk (state=3): >>> <<< 11389 1726854848.78144: 
stdout chunk (state=3): >>>{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "08", "epoch": "1726854848", "epoch_int": "1726854848", "date": "2024-09-20", "time": "13:54:08", "iso8601_micro": "2024-09-20T17:54:08.431299Z", "iso8601": "2024-09-20T17:54:08Z", "iso8601_basic": "20240920T135408431299", "iso8601_basic_short": "20240920T135408", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.533203125, "5m": 0.2255859375, "15m": 0.119140625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", 
"console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": 
"10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2986, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 545, "free": 2986}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "a<<< 11389 1726854848.78194: stdout chunk (state=3): >>>nsible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805670400, "block_size": 4096, "block_total": 65519099, "block_available": 63917400, "block_used": 1601699, "inode_total": 131070960, "inode_available": 131029139, "inode_used": 41821, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11389 1726854848.78809: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib 
# destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 11389 1726854848.78837: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 <<< 11389 1726854848.78897: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing <<< 11389 1726854848.79005: stdout chunk (state=3): >>># cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast <<< 11389 1726854848.79015: stdout chunk (state=3): >>># cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux <<< 
11389 1726854848.79116: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing 
ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 11389 1726854848.79122: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix <<< 11389 1726854848.79151: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux <<< 11389 1726854848.79181: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # 
cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11389 1726854848.79496: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11389 1726854848.79533: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 11389 1726854848.79546: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11389 1726854848.79574: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11389 1726854848.79633: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 11389 1726854848.79650: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 11389 1726854848.79699: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 11389 1726854848.79702: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 11389 1726854848.79756: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11389 1726854848.79773: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11389 1726854848.79776: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11389 1726854848.79848: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool <<< 11389 
1726854848.79852: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy _compat_pickle <<< 11389 1726854848.79915: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 11389 1726854848.79918: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 11389 1726854848.79921: stdout chunk (state=3): >>># destroy base64 <<< 11389 1726854848.79975: stdout chunk (state=3): >>># destroy _ssl <<< 11389 1726854848.79977: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 11389 1726854848.80005: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 11389 1726854848.80043: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 11389 1726854848.80046: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 11389 1726854848.80113: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 11389 1726854848.80158: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 11389 1726854848.80200: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11389 1726854848.80222: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 11389 1726854848.80289: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys <<< 11389 1726854848.80308: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11389 1726854848.80439: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11389 1726854848.80497: stdout chunk (state=3): >>># destroy _collections <<< 11389 1726854848.80500: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11389 1726854848.80538: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11389 1726854848.80575: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11389 1726854848.80593: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11389 1726854848.80611: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11389 1726854848.80710: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback <<< 11389 1726854848.80746: stdout chunk (state=3): >>># destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref 
<<< 11389 1726854848.80812: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc <<< 11389 1726854848.80815: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 11389 1726854848.80840: stdout chunk (state=3): >>># clear sys.audit hooks <<< 11389 1726854848.81366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854848.81369: stdout chunk (state=3): >>><<< 11389 1726854848.81372: stderr chunk (state=3): >>><<< 11389 1726854848.81709: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfe184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfde7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfe1aa50> import '_signal' # import '_abc' # 
import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc09130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc09fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc47da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc47fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc7f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc7fe00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5fa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5d160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc44f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9f6b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9e2d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc5e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc9cb60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd46b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc441d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcd4b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd4a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcd4dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfc42cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd54c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd5190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd63c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf05c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf1d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf2ba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf20f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfcf3c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcf33b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd6330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf9ebbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa146e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa14440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa14710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa14fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfa159d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa14890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf9e9d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa16db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa15af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfcd6ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa3f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa63470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac4290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac69f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfac43b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa91280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa62270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acfa17ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8acfa62870> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_sdhevthf/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf38b1a0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf36a090> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3691f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf388e90> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3baa80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3ba8a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf38be30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3bb7d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf3bb9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf3bbf20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf225ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf2278c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2282c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2291c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22bec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acfc42de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22a1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf233e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf232bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf22a6c0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf277fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2781a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf279c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2799d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf27c170> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27f950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf27c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf280ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf278350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf108200> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf109460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf282990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf283d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf282600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf111640> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf112420> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1096a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf112330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1134d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acf11e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf119730> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2069c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf2fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf11e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf11de20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee17f80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1c590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf19aea0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b2c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b0770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b0b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1f290> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1eb40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee1ed20> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1df70> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1f470> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acee81fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee1ff80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acf1b04a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee83830> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acee82a20> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aceebe1b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aceeadf70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8aceed1790> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8aceeaf170> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8acec6a4b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acec69100> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acec60170> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb0ec0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb1ca0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecf83e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8acecb3f80> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "08", "epoch": "1726854848", "epoch_int": "1726854848", "date": "2024-09-20", "time": "13:54:08", "iso8601_micro": "2024-09-20T17:54:08.431299Z", "iso8601": "2024-09-20T17:54:08Z", "iso8601_basic": "20240920T135408431299", "iso8601_basic_short": 
"20240920T135408", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.533203125, "5m": 0.2255859375, "15m": 0.119140625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2986, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 545, "free": 2986}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 619, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805670400, "block_size": 4096, "block_total": 65519099, "block_available": 63917400, "block_used": 1601699, "inode_total": 131070960, "inode_available": 131029139, "inode_used": 41821, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing 
codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing 
_bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid 
# cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy 
_ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] 
wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib 
# destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
[WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown trace elided: runs of "# clear sys.*", "# cleanup[2] removing <module>", "# cleanup[3] wiping <module>", and "# destroy <module>" lines emitted by python3.12 on the managed host as it tore down its module table)
[WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
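The interpreter-discovery warning above is benign for this run, but it can be silenced by pinning the interpreter per host instead of relying on discovery. A minimal sketch of a host entry (hypothetical; the real inventory at /tmp/network-Koj/inventory.yml is not shown in this log, only the 10.31.9.244 address from the SSH debug output is):

```yaml
# inventory.yml (sketch, not the actual file from this run)
all:
  hosts:
    managed_node3:
      ansible_host: 10.31.9.244                        # address seen in the SSH debug chunks above
      ansible_python_interpreter: /usr/bin/python3.12  # pin explicitly; disables discovery and the warning
```

With ansible_python_interpreter set, ansible-core skips interpreter discovery for that host entirely, so a later installation of another Python cannot change which interpreter modules run under.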
11389 1726854848.83625: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854848.83644: _low_level_execute_command(): starting 11389 1726854848.83646: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854847.6262517-11461-187239997535993/ > /dev/null 2>&1 && sleep 0' 11389 1726854848.84334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854848.84357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854848.84458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.84461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854848.84465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.84494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854848.84507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854848.84516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854848.84608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854848.86608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854848.86630: stderr chunk (state=3): >>><<< 11389 1726854848.86634: stdout chunk (state=3): >>><<< 11389 1726854848.86652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854848.86716: handler run complete 11389 1726854848.86802: variable 'ansible_facts' from source: unknown 11389 1726854848.86897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.87195: variable 'ansible_facts' from source: unknown 11389 1726854848.87266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.87458: attempt loop complete, returning result 11389 1726854848.87481: _execute() done 11389 1726854848.87573: dumping result to json 11389 1726854848.87578: done dumping result, returning 11389 1726854848.87581: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcc66-ac2b-deb8-c119-0000000000cc] 11389 1726854848.87583: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000cc 11389 1726854848.88442: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000cc 11389 1726854848.88446: WORKER PROCESS EXITING ok: [managed_node3] 11389 1726854848.88869: no more pending results, returning what we have 11389 1726854848.88872: results queue empty 11389 1726854848.88873: checking for any_errors_fatal 11389 1726854848.88875: done checking for any_errors_fatal 11389 1726854848.88875: checking for max_fail_percentage 11389 1726854848.88877: done checking for max_fail_percentage 11389 1726854848.88877: checking to see if all hosts have failed and the running result is not ok 11389 1726854848.88878: done checking to see if all hosts have failed 11389 1726854848.88879: getting the remaining hosts for this loop 11389 1726854848.88880: done getting the remaining hosts for this loop 11389 1726854848.88884: getting the next task for host managed_node3 11389 1726854848.88892: done getting next task for host managed_node3 11389 1726854848.88893: ^ task is: TASK: meta 
(flush_handlers) 11389 1726854848.88895: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854848.88899: getting variables 11389 1726854848.88901: in VariableManager get_vars() 11389 1726854848.88923: Calling all_inventory to load vars for managed_node3 11389 1726854848.88926: Calling groups_inventory to load vars for managed_node3 11389 1726854848.88929: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854848.88938: Calling all_plugins_play to load vars for managed_node3 11389 1726854848.88941: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854848.88944: Calling groups_plugins_play to load vars for managed_node3 11389 1726854848.89144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.89344: done with get_vars() 11389 1726854848.89354: done getting variables 11389 1726854848.89423: in VariableManager get_vars() 11389 1726854848.89431: Calling all_inventory to load vars for managed_node3 11389 1726854848.89433: Calling groups_inventory to load vars for managed_node3 11389 1726854848.89435: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854848.89439: Calling all_plugins_play to load vars for managed_node3 11389 1726854848.89441: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854848.89444: Calling groups_plugins_play to load vars for managed_node3 11389 1726854848.89583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.89778: done with get_vars() 11389 1726854848.89792: done queuing things up, now waiting for results queue to drain 11389 1726854848.89794: results queue 
empty 11389 1726854848.89794: checking for any_errors_fatal 11389 1726854848.89796: done checking for any_errors_fatal 11389 1726854848.89797: checking for max_fail_percentage 11389 1726854848.89802: done checking for max_fail_percentage 11389 1726854848.89803: checking to see if all hosts have failed and the running result is not ok 11389 1726854848.89804: done checking to see if all hosts have failed 11389 1726854848.89804: getting the remaining hosts for this loop 11389 1726854848.89805: done getting the remaining hosts for this loop 11389 1726854848.89807: getting the next task for host managed_node3 11389 1726854848.89811: done getting next task for host managed_node3 11389 1726854848.89813: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11389 1726854848.89815: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854848.89816: getting variables 11389 1726854848.89817: in VariableManager get_vars() 11389 1726854848.89824: Calling all_inventory to load vars for managed_node3 11389 1726854848.89826: Calling groups_inventory to load vars for managed_node3 11389 1726854848.89828: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854848.89832: Calling all_plugins_play to load vars for managed_node3 11389 1726854848.89834: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854848.89836: Calling groups_plugins_play to load vars for managed_node3 11389 1726854848.89990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.90181: done with get_vars() 11389 1726854848.90190: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Friday 20 September 2024 13:54:08 -0400 (0:00:01.318) 0:00:01.325 ****** 11389 1726854848.90264: entering _queue_task() for managed_node3/include_tasks 11389 1726854848.90265: Creating lock for include_tasks 11389 1726854848.90586: worker is 1 (out of 1 available) 11389 1726854848.90609: exiting _queue_task() for managed_node3/include_tasks 11389 1726854848.90624: done queuing things up, now waiting for results queue to drain 11389 1726854848.90626: waiting for pending results... 
11389 1726854848.90905: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 11389 1726854848.90944: in run() - task 0affcc66-ac2b-deb8-c119-000000000006 11389 1726854848.90963: variable 'ansible_search_path' from source: unknown 11389 1726854848.91006: calling self._execute() 11389 1726854848.91077: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854848.91090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854848.91108: variable 'omit' from source: magic vars 11389 1726854848.91392: _execute() done 11389 1726854848.91396: dumping result to json 11389 1726854848.91397: done dumping result, returning 11389 1726854848.91400: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0affcc66-ac2b-deb8-c119-000000000006] 11389 1726854848.91401: sending task result for task 0affcc66-ac2b-deb8-c119-000000000006 11389 1726854848.91461: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000006 11389 1726854848.91463: WORKER PROCESS EXITING 11389 1726854848.91507: no more pending results, returning what we have 11389 1726854848.91511: in VariableManager get_vars() 11389 1726854848.91542: Calling all_inventory to load vars for managed_node3 11389 1726854848.91544: Calling groups_inventory to load vars for managed_node3 11389 1726854848.91548: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854848.91558: Calling all_plugins_play to load vars for managed_node3 11389 1726854848.91561: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854848.91564: Calling groups_plugins_play to load vars for managed_node3 11389 1726854848.91805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.92004: done with get_vars() 11389 1726854848.92011: variable 'ansible_search_path' from source: unknown 11389 1726854848.92023: we have 
included files to process 11389 1726854848.92024: generating all_blocks data 11389 1726854848.92026: done generating all_blocks data 11389 1726854848.92027: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11389 1726854848.92028: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11389 1726854848.92030: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11389 1726854848.92671: in VariableManager get_vars() 11389 1726854848.92688: done with get_vars() 11389 1726854848.92701: done processing included file 11389 1726854848.92703: iterating over new_blocks loaded from include file 11389 1726854848.92704: in VariableManager get_vars() 11389 1726854848.92714: done with get_vars() 11389 1726854848.92715: filtering new block on tags 11389 1726854848.92729: done filtering new block on tags 11389 1726854848.92732: in VariableManager get_vars() 11389 1726854848.92743: done with get_vars() 11389 1726854848.92744: filtering new block on tags 11389 1726854848.92760: done filtering new block on tags 11389 1726854848.92763: in VariableManager get_vars() 11389 1726854848.92776: done with get_vars() 11389 1726854848.92778: filtering new block on tags 11389 1726854848.92793: done filtering new block on tags 11389 1726854848.92795: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 11389 1726854848.92800: extending task lists for all hosts with included blocks 11389 1726854848.92846: done extending task lists 11389 1726854848.92847: done processing included files 11389 1726854848.92848: results queue empty 11389 1726854848.92848: checking for any_errors_fatal 11389 1726854848.92850: done checking for any_errors_fatal 11389 
1726854848.92850: checking for max_fail_percentage 11389 1726854848.92851: done checking for max_fail_percentage 11389 1726854848.92852: checking to see if all hosts have failed and the running result is not ok 11389 1726854848.92853: done checking to see if all hosts have failed 11389 1726854848.92853: getting the remaining hosts for this loop 11389 1726854848.92854: done getting the remaining hosts for this loop 11389 1726854848.92857: getting the next task for host managed_node3 11389 1726854848.92861: done getting next task for host managed_node3 11389 1726854848.92863: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11389 1726854848.92865: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854848.92871: getting variables 11389 1726854848.92872: in VariableManager get_vars() 11389 1726854848.92880: Calling all_inventory to load vars for managed_node3 11389 1726854848.92882: Calling groups_inventory to load vars for managed_node3 11389 1726854848.92885: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854848.92891: Calling all_plugins_play to load vars for managed_node3 11389 1726854848.92894: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854848.92897: Calling groups_plugins_play to load vars for managed_node3 11389 1726854848.93052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854848.93235: done with get_vars() 11389 1726854848.93243: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:54:08 -0400 (0:00:00.030) 0:00:01.356 ****** 11389 1726854848.93311: entering _queue_task() for managed_node3/setup 11389 1726854848.93561: worker is 1 (out of 1 available) 11389 1726854848.93573: exiting _queue_task() for managed_node3/setup 11389 1726854848.93584: done queuing things up, now waiting for results queue to drain 11389 1726854848.93586: waiting for pending results... 
11389 1726854848.93826: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 11389 1726854848.93933: in run() - task 0affcc66-ac2b-deb8-c119-0000000000dd 11389 1726854848.93952: variable 'ansible_search_path' from source: unknown 11389 1726854848.93960: variable 'ansible_search_path' from source: unknown 11389 1726854848.94002: calling self._execute() 11389 1726854848.94078: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854848.94092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854848.94130: variable 'omit' from source: magic vars 11389 1726854848.94678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854848.96759: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854848.96831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854848.96948: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854848.96951: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854848.96953: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854848.97022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854848.97061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854848.97093: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854848.97139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854848.97158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854848.97343: variable 'ansible_facts' from source: unknown 11389 1726854848.97592: variable 'network_test_required_facts' from source: task vars 11389 1726854848.97596: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11389 1726854848.97599: variable 'omit' from source: magic vars 11389 1726854848.97601: variable 'omit' from source: magic vars 11389 1726854848.97603: variable 'omit' from source: magic vars 11389 1726854848.97605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854848.97608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854848.97617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854848.97639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854848.97655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854848.97690: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854848.97699: variable 'ansible_host' from source: host vars for 
'managed_node3' 11389 1726854848.97708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854848.97815: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854848.97831: Set connection var ansible_timeout to 10 11389 1726854848.97838: Set connection var ansible_connection to ssh 11389 1726854848.97846: Set connection var ansible_shell_type to sh 11389 1726854848.97853: Set connection var ansible_pipelining to False 11389 1726854848.97862: Set connection var ansible_shell_executable to /bin/sh 11389 1726854848.97886: variable 'ansible_shell_executable' from source: unknown 11389 1726854848.97895: variable 'ansible_connection' from source: unknown 11389 1726854848.97901: variable 'ansible_module_compression' from source: unknown 11389 1726854848.97906: variable 'ansible_shell_type' from source: unknown 11389 1726854848.97910: variable 'ansible_shell_executable' from source: unknown 11389 1726854848.97915: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854848.97921: variable 'ansible_pipelining' from source: unknown 11389 1726854848.97926: variable 'ansible_timeout' from source: unknown 11389 1726854848.97935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854848.98085: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854848.98105: variable 'omit' from source: magic vars 11389 1726854848.98160: starting attempt loop 11389 1726854848.98164: running the handler 11389 1726854848.98166: _low_level_execute_command(): starting 11389 1726854848.98168: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854848.98880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 
1726854848.98936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854848.99003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854848.99036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854848.99061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854848.99104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854848.99163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.00883: stdout chunk (state=3): >>>/root <<< 11389 1726854849.01080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.01084: stdout chunk (state=3): >>><<< 11389 1726854849.01086: stderr chunk (state=3): >>><<< 11389 1726854849.01092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.01100: _low_level_execute_command(): starting 11389 1726854849.01102: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087 `" && echo ansible-tmp-1726854849.0104413-11517-54428328632087="` echo /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087 `" ) && sleep 0' 11389 1726854849.01942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.01956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.01971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854849.01992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854849.02010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854849.02022: stderr chunk (state=3): 
>>>debug2: match not found <<< 11389 1726854849.02035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.02053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854849.02130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.02154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.02175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.02191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.02279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.04213: stdout chunk (state=3): >>>ansible-tmp-1726854849.0104413-11517-54428328632087=/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087 <<< 11389 1726854849.04315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.04347: stderr chunk (state=3): >>><<< 11389 1726854849.04358: stdout chunk (state=3): >>><<< 11389 1726854849.04386: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854849.0104413-11517-54428328632087=/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.04602: variable 'ansible_module_compression' from source: unknown 11389 1726854849.04605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11389 1726854849.04608: variable 'ansible_facts' from source: unknown 11389 1726854849.04777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py 11389 1726854849.04951: Sending initial data 11389 1726854849.04960: Sent initial data (153 bytes) 11389 1726854849.05564: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.05580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.05602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854849.05621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854849.05709: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.05731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.05749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.05768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.05924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.07446: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854849.07499: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854849.07612: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp3tr2e1je /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py <<< 11389 1726854849.07616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py" <<< 11389 1726854849.07669: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp3tr2e1je" to remote "/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py" <<< 11389 1726854849.10797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.10801: stdout chunk (state=3): >>><<< 11389 1726854849.10808: stderr chunk (state=3): >>><<< 11389 1726854849.10810: done transferring module to remote 11389 1726854849.10813: _low_level_execute_command(): starting 11389 1726854849.10815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/ /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py && sleep 0' 11389 1726854849.12252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.12298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.12462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.12536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.14428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.14439: stdout chunk (state=3): >>><<< 11389 1726854849.14449: stderr chunk (state=3): >>><<< 11389 1726854849.14469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.14497: _low_level_execute_command(): starting 11389 1726854849.14508: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/AnsiballZ_setup.py && sleep 0' 11389 1726854849.15675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.15727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.15854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.15978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.16076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 
1726854849.18317: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11389 1726854849.18385: stdout chunk (state=3): >>>import '_io' # <<< 11389 1726854849.18410: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 11389 1726854849.18472: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 11389 1726854849.18529: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11389 1726854849.18572: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.18623: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11389 1726854849.18696: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ca184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9e7b30> <<< 11389 1726854849.18741: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ca1aa50> <<< 11389 1726854849.18745: stdout chunk (state=3): >>>import '_signal' # <<< 11389 1726854849.18748: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11389 1726854849.18750: stdout chunk (state=3): >>>import 'io' # <<< 11389 1726854849.18783: stdout 
chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11389 1726854849.18861: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11389 1726854849.18910: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11389 1726854849.18944: stdout chunk (state=3): >>>import 'os' # <<< 11389 1726854849.18973: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11389 1726854849.19005: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11389 1726854849.19026: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11389 1726854849.19029: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c82d130> <<< 11389 1726854849.19076: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11389 1726854849.19114: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c82dfa0> <<< 11389 1726854849.19124: stdout chunk (state=3): >>>import 'site' # <<< 11389 1726854849.19153: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11389 1726854849.19560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11389 1726854849.19563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11389 1726854849.19589: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11389 1726854849.19649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11389 1726854849.19676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11389 1726854849.19697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c86be90> <<< 11389 1726854849.19720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11389 1726854849.19757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11389 1726854849.19789: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c86bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11389 1726854849.19815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11389 1726854849.19818: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 11389 1726854849.19907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.19912: stdout chunk (state=3): >>>import 'itertools' # <<< 11389 1726854849.19937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8a3ec0> <<< 11389 1726854849.19961: stdout chunk (state=3): >>>import '_collections' # <<< 11389 1726854849.20017: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c883b60> import '_functools' # <<< 11389 1726854849.20039: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c881280> <<< 11389 1726854849.20136: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c869040> <<< 11389 1726854849.20150: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11389 1726854849.20178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11389 1726854849.20233: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11389 1726854849.20258: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11389 1726854849.20294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11389 1726854849.20300: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c37d0> <<< 11389 1726854849.20331: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 11389 1726854849.20367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c882150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c0c20> <<< 11389 1726854849.20413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11389 1726854849.20461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 11389 1726854849.20495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import 
'_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c8f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f8bc0> <<< 11389 1726854849.20626: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c8f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c866de0> <<< 11389 1726854849.20661: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f92e0> import 'importlib.machinery' # <<< 11389 1726854849.20794: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fa510> import 'importlib.util' # import 'runpy' # <<< 11389 1726854849.20863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c910710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c911df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11389 1726854849.20877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c912c90> <<< 11389 1726854849.20910: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c9132f0> <<< 11389 1726854849.20945: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11389 1726854849.20977: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.21057: stdout 
chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c913d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9134a0> <<< 11389 1726854849.21079: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11389 1726854849.21180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c623bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11389 1726854849.21217: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.21243: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64c410> # extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64c6e0> <<< 11389 1726854849.21280: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11389 1726854849.21371: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.21485: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64d010> <<< 11389 1726854849.21599: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64da00> <<< 11389 1726854849.21625: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c621d90> <<< 11389 1726854849.21642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11389 1726854849.21680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11389 1726854849.21697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc 
matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64ee10> <<< 11389 1726854849.21731: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64db50> <<< 11389 1726854849.21743: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fac30> <<< 11389 1726854849.21769: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11389 1726854849.21982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11389 1726854849.22041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6771a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11389 1726854849.22047: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c69b530> <<< 11389 1726854849.22067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11389 1726854849.22112: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11389 1726854849.22160: stdout chunk (state=3): >>>import 'ntpath' # <<< 11389 1726854849.22193: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fc290> <<< 11389 1726854849.22212: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11389 1726854849.22249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11389 1726854849.22284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11389 1726854849.22342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11389 1726854849.22372: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fe9f0> <<< 11389 1726854849.22455: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fc3b0> <<< 11389 1726854849.22481: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6c12e0> <<< 11389 1726854849.22527: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 11389 1726854849.22538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c5053a0> import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527c69a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64fd70> <<< 11389 1726854849.22720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11389 1726854849.22730: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f527c505640> <<< 11389 1726854849.23031: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_7foiieoa/ansible_setup_payload.zip' # zipimport: zlib available <<< 11389 1726854849.23157: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.23193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11389 1726854849.23196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11389 1726854849.23229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11389 1726854849.23312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11389 1726854849.23333: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c56f080> <<< 11389 1726854849.23355: stdout chunk (state=3): >>>import '_typing' # <<< 11389 1726854849.23534: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c54df70> <<< 11389 1726854849.23561: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c54d100> # 
zipimport: zlib available <<< 11389 1726854849.23579: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11389 1726854849.23625: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11389 1726854849.25080: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.26128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c56cf50> <<< 11389 1726854849.26182: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.26186: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11389 1726854849.26190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11389 1726854849.26239: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11389 1726854849.26243: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59eab0> <<< 11389 1726854849.26268: stdout chunk (state=3): >>>import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e840> <<< 11389 1726854849.26316: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e150> <<< 11389 1726854849.26339: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11389 1726854849.26367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c56fd10> <<< 11389 1726854849.26397: stdout chunk (state=3): >>>import 'atexit' # <<< 11389 1726854849.26412: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59f830> <<< 11389 1726854849.26452: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59fa70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11389 1726854849.26503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11389 1726854849.26515: stdout chunk (state=3): >>>import '_locale' # <<< 11389 1726854849.26565: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59ffb0> <<< 11389 1726854849.26589: 
stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11389 1726854849.26604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11389 1726854849.26667: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf2dd60> <<< 11389 1726854849.26691: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf2f4a0> <<< 11389 1726854849.26712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11389 1726854849.26782: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf302f0> <<< 11389 1726854849.26821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11389 1726854849.26825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11389 1726854849.26827: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf31490> <<< 11389 1726854849.26829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11389 1726854849.26891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11389 1726854849.26897: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11389 1726854849.26936: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf33f50> <<< 11389 1726854849.26996: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf38290> <<< 11389 1726854849.27015: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf32240> <<< 11389 1726854849.27037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11389 1726854849.27062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11389 1726854849.27079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11389 1726854849.27096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11389 1726854849.27244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11389 1726854849.27248: stdout chunk (state=3): >>>import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3bef0> import '_tokenize' # <<< 11389 1726854849.27379: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3a9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3a750> <<< 11389 1726854849.27451: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11389 1726854849.27600: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3ac90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf32720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf7ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11389 1726854849.27612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf81d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf81ac0> <<< 11389 1726854849.27683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11389 1726854849.27830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf84260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf823c0> <<< 11389 1726854849.27835: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf879e0> <<< 11389 1726854849.27961: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf843b0> <<< 11389 1726854849.28240: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' 
executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11389 1726854849.28243: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88830> <<< 11389 1726854849.28246: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf80380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be10320> <<< 11389 1726854849.28362: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.28395: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be112e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf8aab0> <<< 11389 1726854849.28424: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf8be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf8a6f0> # zipimport: zlib available <<< 11389 1726854849.28465: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11389 1726854849.28552: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.28629: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.28831: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11389 1726854849.28834: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11389 1726854849.28853: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.28939: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.29459: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.29991: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11389 1726854849.30016: stdout chunk (state=3): >>>import 
'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11389 1726854849.30040: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.30097: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be19520> <<< 11389 1726854849.30186: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11389 1726854849.30214: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1a300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be11130> <<< 11389 1726854849.30320: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11389 1726854849.30463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.30610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11389 1726854849.30621: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1a330> # zipimport: zlib available <<< 
11389 1726854849.31074: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31507: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31576: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31650: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 11389 1726854849.31694: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31730: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11389 1726854849.31745: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31800: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31901: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 11389 1726854849.31911: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11389 1726854849.31934: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.31959: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32069: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11389 1726854849.32314: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11389 1726854849.32513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11389 1726854849.32756: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1b560> # zipimport: zlib available # zipimport: zlib available <<< 11389 1726854849.32779: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11389 1726854849.32782: stdout 
chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11389 1726854849.32785: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32820: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32860: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11389 1726854849.32877: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.32951: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.33058: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.33074: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11389 1726854849.33112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.33195: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 11389 1726854849.33221: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be26090> <<< 11389 1726854849.33406: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be217c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11389 1726854849.33420: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available <<< 11389 1726854849.33468: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.33490: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11389 1726854849.33517: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11389 1726854849.33584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11389 1726854849.33625: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11389 1726854849.33717: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf0ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c5ca720> <<< 11389 1726854849.33788: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be26270> <<< 11389 1726854849.33846: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be25e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11389 1726854849.33895: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11389 1726854849.33926: stdout chunk 
(state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 11389 1726854849.33953: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11389 1726854849.34013: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34079: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34109: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34113: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34161: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34197: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34229: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34272: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11389 1726854849.34292: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34345: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34414: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34434: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34473: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11389 1726854849.34649: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34819: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34858: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.34919: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 11389 1726854849.34945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11389 1726854849.34986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11389 1726854849.34992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11389 1726854849.35029: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb6180> <<< 11389 1726854849.35065: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11389 1726854849.35068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11389 1726854849.35112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11389 1726854849.35116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11389 1726854849.35146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11389 1726854849.35167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba68140> <<< 11389 1726854849.35205: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.35217: stdout chunk (state=3): >>># 
extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba684a0> <<< 11389 1726854849.35258: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be9c770> <<< 11389 1726854849.35296: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb6cc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb48f0> <<< 11389 1726854849.35328: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb44a0> <<< 11389 1726854849.35332: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11389 1726854849.35394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11389 1726854849.35398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11389 1726854849.35427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11389 1726854849.35473: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba6b440> <<< 11389 1726854849.35490: stdout chunk (state=3): 
>>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba6aea0> <<< 11389 1726854849.35519: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11389 1726854849.35657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11389 1726854849.35668: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6b500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11389 1726854849.35700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11389 1726854849.35735: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bab6000> <<< 11389 1726854849.35783: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6bfe0> <<< 11389 1726854849.35813: stdout chunk (state=3): >>>import 'multiprocessing.pool' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527beb4500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 11389 1726854849.35848: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11389 1726854849.35911: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.35958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11389 1726854849.35982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.36025: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.36075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 11389 1726854849.36109: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 11389 1726854849.36152: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.36203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11389 1726854849.36223: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854849.36617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11389 1726854849.37079: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.37513: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 11389 1726854849.37561: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.37621: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.37646: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.37743: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 11389 1726854849.37768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 11389 1726854849.37832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.37880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 11389 1726854849.37954: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11389 1726854849.37982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11389 1726854849.38064: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38101: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11389 1726854849.38232: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bab60c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11389 1726854849.38271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11389 1726854849.38384: stdout chunk (state=3): >>>import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527bab6ae0> <<< 11389 1726854849.38426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11389 1726854849.38452: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11389 1726854849.38613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38808: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 11389 1726854849.38812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 11389 1726854849.38944: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.38962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11389 1726854849.38983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11389 1726854849.39062: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.39115: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527baf6360> <<< 11389 1726854849.39370: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bae7140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 11389 1726854849.39385: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39491: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.selinux' # <<< 11389 1726854849.39518: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39522: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39595: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39712: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39952: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11389 1726854849.39955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.39991: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40039: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11389 1726854849.40084: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.40105: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bb09ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527baf5b50> import 'ansible.module_utils.facts.system.user' # <<< 11389 1726854849.40146: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11389 1726854849.40184: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40219: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.base' # <<< 11389 1726854849.40246: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40384: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40527: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11389 1726854849.40635: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40807: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40851: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11389 1726854849.40866: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.40902: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.41017: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.41227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11389 1726854849.41349: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.41397: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11389 1726854849.41415: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.41484: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.41501: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.42192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.42548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 11389 1726854849.42657: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.42749: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11389 1726854849.42778: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.42874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.42983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 11389 1726854849.43176: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.43259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11389 1726854849.43316: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11389 1726854849.43320: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.43344: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.43393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 11389 1726854849.43496: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.43591: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.43843: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 11389 1726854849.44040: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 11389 1726854849.44110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11389 1726854849.44151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44204: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11389 1726854849.44278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 11389 1726854849.44317: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11389 1726854849.44395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44454: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 11389 1726854849.44518: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11389 1726854849.44586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.44838: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11389 1726854849.45160: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11389 1726854849.45250: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11389 1726854849.45338: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11389 1726854849.45400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45405: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45452: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11389 1726854849.45527: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11389 1726854849.45650: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854849.45667: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11389 1726854849.45691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11389 1726854849.45746: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45785: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854849.45828: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.45946: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11389 1726854849.46038: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46077: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 11389 1726854849.46141: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46331: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11389 1726854849.46546: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11389 1726854849.46573: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46625: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11389 1726854849.46628: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46676: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11389 1726854849.46807: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46902: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11389 1726854849.46916: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.46992: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.47079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11389 1726854849.47156: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.48191: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11389 1726854849.48195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.48229: stdout chunk (state=3): >>># extension module 'unicodedata' executed from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527b90b7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527b908830> <<< 11389 1726854849.48260: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527b90a5a0> <<< 11389 1726854849.48638: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "09", "epoch": "1726854849", "epoch_int": "1726854849", "date": "2024-09-20", "time": "13:54:09", "iso8601_micro": "2024-09-20T17:54:09.472762Z", "iso8601": "2024-09-20T17:54:09Z", "iso8601_basic": "20240920T135409472762", "iso8601_basic_short": "20240920T135409", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 
52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_syst<<< 11389 1726854849.48642: stdout chunk (state=3): >>>em_capabilities": [], "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11389 1726854849.49261: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 11389 1726854849.49268: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear 
sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings <<< 11389 1726854849.49336: stdout chunk (state=3): >>># cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre <<< 11389 1726854849.49350: stdout chunk (state=3): >>># cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] 
removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 11389 1726854849.49355: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma <<< 11389 1726854849.49358: stdout chunk (state=3): >>># cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 11389 1726854849.49413: stdout chunk (state=3): >>># cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing 
_json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache <<< 11389 1726854849.49417: stdout chunk (state=3): >>># cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 11389 1726854849.49437: stdout chunk (state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # 
destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text <<< 11389 1726854849.49458: stdout chunk (state=3): >>># destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool <<< 11389 1726854849.49484: stdout chunk (state=3): >>># cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # 
cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle<<< 11389 1726854849.49510: stdout chunk (state=3): >>> # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 11389 1726854849.49571: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors <<< 11389 1726854849.49592: stdout chunk 
(state=3): >>># cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # 
destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl <<< 11389 1726854849.49622: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11389 1726854849.49980: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11389 1726854849.49989: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11389 1726854849.50046: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # 
destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11389 1726854849.50065: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11389 1726854849.50127: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 11389 1726854849.50159: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 11389 1726854849.50197: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11389 1726854849.50228: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11389 1726854849.50239: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11389 1726854849.50298: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 11389 1726854849.50355: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata <<< 11389 1726854849.50360: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 11389 1726854849.50413: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 11389 1726854849.50416: stdout chunk (state=3): >>># destroy _ssl <<< 11389 
1726854849.50450: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 11389 1726854849.50480: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 11389 1726854849.50501: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing<<< 11389 1726854849.50509: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11389 1726854849.50562: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 11389 1726854849.50579: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 11389 1726854849.50627: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 11389 1726854849.50630: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re 
# destroy re._constants # destroy re._casefix <<< 11389 1726854849.50672: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 11389 1726854849.50698: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 11389 1726854849.50732: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11389 1726854849.50742: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11389 1726854849.50898: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11389 1726854849.50908: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 11389 1726854849.50941: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 11389 1726854849.50953: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11389 1726854849.50991: stdout chunk 
(state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11389 1726854849.51039: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11389 1726854849.51042: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io <<< 11389 1726854849.51068: stdout chunk (state=3): >>># destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11389 1726854849.51170: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 11389 1726854849.51212: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11389 1726854849.51217: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11389 1726854849.51294: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11389 1726854849.51298: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11389 1726854849.51758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854849.51762: stdout chunk (state=3): >>><<< 11389 1726854849.51764: stderr chunk (state=3): >>><<< 11389 1726854849.52006: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ca184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ca1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c82d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c82dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c86be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c86bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c883b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c881280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c869040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c882150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8c0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c8f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c8f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c866de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c910710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c911df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c912c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c9132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c913d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c9134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c623bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c64da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c621d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c8fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6771a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c69b530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fc290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fe9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6fc3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c6c12e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c5053a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c69a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c64fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f527c505640> # zipimport: found 103 names in '/tmp/ansible_setup_payload_7foiieoa/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f527c56f080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c54df70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c54d100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c56cf50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59e5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c56fd10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59f830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527c59fa70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c59ffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf2dd60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf2f4a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f527bf302f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf31490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf33f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf38290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf32240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3bef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3a9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3a750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf3ac90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf32720> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf7ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf81d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf81ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf84260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf823c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf879e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf843b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf88cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf80380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be10320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be112e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf8aab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bf8be60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf8a6f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be19520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1a300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be11130> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1a330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be1b560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527be26090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be217c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bf0ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527c5ca720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be26270> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be25e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb6180> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba68140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba684a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527be9c770> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb6cc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb48f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb44a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba6b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527ba6aea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6b500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bab6000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527ba6bfe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527beb4500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bab60c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bab6ae0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527baf6360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527bae7140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527bb09ee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527baf5b50> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f527b90b7a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527b908830> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f527b90a5a0> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "09", "epoch": "1726854849", "epoch_int": "1726854849", "date": "2024-09-20", "time": "13:54:09", "iso8601_micro": "2024-09-20T17:54:09.472762Z", "iso8601": "2024-09-20T17:54:09Z", "iso8601_basic": "20240920T135409472762", "iso8601_basic_short": "20240920T135409", "tz": 
"EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": 
["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy 
urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] 
removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
[WARNING]: Module invocation had junk after the JSON data: 11389 1726854849.53509: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854849.53512: _low_level_execute_command(): starting 11389 1726854849.53515: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854849.0104413-11517-54428328632087/ > /dev/null 2>&1 && sleep 0' 11389 1726854849.53517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.53520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.53522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854849.53524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854849.53526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854849.53528: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854849.53530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.53532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854849.53533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854849.53535: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11389 1726854849.53537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.53539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854849.53541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854849.53543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854849.53545: stderr chunk (state=3): >>>debug2: match found <<< 11389 1726854849.53547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.53549: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.53550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.53552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.53554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.55494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.55498: stderr chunk (state=3): >>><<< 11389 1726854849.55500: stdout chunk (state=3): >>><<< 11389 1726854849.55503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.55505: handler run complete 11389 1726854849.55507: variable 'ansible_facts' from source: unknown 11389 1726854849.55564: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854849.55697: variable 'ansible_facts' from source: unknown 11389 1726854849.55772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854849.55832: attempt loop complete, returning result 11389 1726854849.55839: _execute() done 11389 1726854849.55844: dumping result to json 11389 1726854849.55874: done dumping result, returning 11389 1726854849.55885: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcc66-ac2b-deb8-c119-0000000000dd] 11389 1726854849.55895: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000dd ok: [managed_node3] 11389 1726854849.56172: no more pending results, returning what we have 11389 1726854849.56175: results queue empty 11389 1726854849.56176: checking for any_errors_fatal 11389 1726854849.56178: done checking for any_errors_fatal 11389 1726854849.56179: checking for max_fail_percentage 11389 1726854849.56180: done checking for max_fail_percentage 11389 1726854849.56181: checking to see if all hosts have failed and the running result is not ok 11389 1726854849.56182: done checking to see if all hosts have failed 11389 1726854849.56183: getting the remaining hosts for this loop 11389 1726854849.56184: done getting the remaining hosts for this loop 11389 1726854849.56189: getting the next task for host managed_node3 11389 1726854849.56199: done getting next task for host managed_node3 11389 1726854849.56202: ^ task is: TASK: Check if system is ostree 11389 1726854849.56205: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854849.56209: getting variables 11389 1726854849.56210: in VariableManager get_vars() 11389 1726854849.56241: Calling all_inventory to load vars for managed_node3 11389 1726854849.56245: Calling groups_inventory to load vars for managed_node3 11389 1726854849.56248: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854849.56261: Calling all_plugins_play to load vars for managed_node3 11389 1726854849.56265: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854849.56271: Calling groups_plugins_play to load vars for managed_node3 11389 1726854849.56816: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000dd 11389 1726854849.56819: WORKER PROCESS EXITING 11389 1726854849.56850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854849.57065: done with get_vars() 11389 1726854849.57078: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:54:09 -0400 (0:00:00.638) 0:00:01.994 ****** 11389 1726854849.57193: entering _queue_task() for managed_node3/stat 11389 1726854849.57680: worker is 1 (out of 1 available) 11389 1726854849.57694: exiting _queue_task() for managed_node3/stat 11389 1726854849.57704: done queuing things up, now waiting for results queue to drain 11389 1726854849.57705: waiting for pending results... 
11389 1726854849.57907: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 11389 1726854849.58025: in run() - task 0affcc66-ac2b-deb8-c119-0000000000df 11389 1726854849.58053: variable 'ansible_search_path' from source: unknown 11389 1726854849.58060: variable 'ansible_search_path' from source: unknown 11389 1726854849.58102: calling self._execute() 11389 1726854849.58181: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854849.58195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854849.58210: variable 'omit' from source: magic vars 11389 1726854849.58736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854849.59011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854849.59074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854849.59148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854849.59193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854849.59291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854849.59322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854849.59361: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854849.59399: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854849.59543: Evaluated conditional (not __network_is_ostree is defined): True 11389 1726854849.59555: variable 'omit' from source: magic vars 11389 1726854849.59613: variable 'omit' from source: magic vars 11389 1726854849.59653: variable 'omit' from source: magic vars 11389 1726854849.59696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854849.59727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854849.59749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854849.59774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854849.59798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854849.59831: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854849.59841: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854849.59849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854849.59965: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854849.60015: Set connection var ansible_timeout to 10 11389 1726854849.60018: Set connection var ansible_connection to ssh 11389 1726854849.60020: Set connection var ansible_shell_type to sh 11389 1726854849.60022: Set connection var ansible_pipelining to False 11389 1726854849.60025: Set connection var ansible_shell_executable to /bin/sh 11389 1726854849.60042: variable 'ansible_shell_executable' from source: unknown 11389 1726854849.60051: variable 'ansible_connection' from 
source: unknown 11389 1726854849.60093: variable 'ansible_module_compression' from source: unknown 11389 1726854849.60096: variable 'ansible_shell_type' from source: unknown 11389 1726854849.60099: variable 'ansible_shell_executable' from source: unknown 11389 1726854849.60101: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854849.60103: variable 'ansible_pipelining' from source: unknown 11389 1726854849.60105: variable 'ansible_timeout' from source: unknown 11389 1726854849.60107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854849.60261: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854849.60280: variable 'omit' from source: magic vars 11389 1726854849.60343: starting attempt loop 11389 1726854849.60346: running the handler 11389 1726854849.60348: _low_level_execute_command(): starting 11389 1726854849.60351: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854849.61053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.61069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.61097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.61121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854849.61225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.61250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.61346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.63028: stdout chunk (state=3): >>>/root <<< 11389 1726854849.63204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.63207: stdout chunk (state=3): >>><<< 11389 1726854849.63210: stderr chunk (state=3): >>><<< 11389 1726854849.63230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.63256: _low_level_execute_command(): starting 11389 1726854849.63344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970 `" && echo ansible-tmp-1726854849.632442-11552-265773918633970="` echo /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970 `" ) && sleep 0' 11389 1726854849.64004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854849.64022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.64114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.64155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.64246: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.66193: stdout chunk (state=3): >>>ansible-tmp-1726854849.632442-11552-265773918633970=/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970 <<< 11389 1726854849.66338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.66342: stdout chunk (state=3): >>><<< 11389 1726854849.66344: stderr chunk (state=3): >>><<< 11389 1726854849.66358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854849.632442-11552-265773918633970=/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.66494: variable 'ansible_module_compression' from source: unknown 11389 1726854849.66498: ANSIBALLZ: Using lock for stat 11389 1726854849.66500: 
ANSIBALLZ: Acquiring lock 11389 1726854849.66502: ANSIBALLZ: Lock acquired: 140464425327248 11389 1726854849.66504: ANSIBALLZ: Creating module 11389 1726854849.79147: ANSIBALLZ: Writing module into payload 11389 1726854849.79260: ANSIBALLZ: Writing module 11389 1726854849.79291: ANSIBALLZ: Renaming module 11389 1726854849.79303: ANSIBALLZ: Done creating module 11389 1726854849.79343: variable 'ansible_facts' from source: unknown 11389 1726854849.79417: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py 11389 1726854849.79703: Sending initial data 11389 1726854849.79706: Sent initial data (152 bytes) 11389 1726854849.80504: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.80525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.80551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.80577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.80683: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.82344: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854849.82406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854849.82518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpd6p09fwb /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py <<< 11389 1726854849.82521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py" <<< 11389 1726854849.82576: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpd6p09fwb" to remote "/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py" <<< 11389 1726854849.83999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.84003: stdout chunk (state=3): >>><<< 11389 1726854849.84005: stderr chunk (state=3): >>><<< 
11389 1726854849.84009: done transferring module to remote 11389 1726854849.84034: _low_level_execute_command(): starting 11389 1726854849.84046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/ /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py && sleep 0' 11389 1726854849.85252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.85256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.85519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.85563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.85657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.87868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854849.87871: stdout chunk (state=3): >>><<< 11389 1726854849.87873: stderr chunk (state=3): >>><<< 11389 
1726854849.87876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854849.87878: _low_level_execute_command(): starting 11389 1726854849.87879: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/AnsiballZ_stat.py && sleep 0' 11389 1726854849.88409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854849.88422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854849.88436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854849.88502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854849.88549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854849.88576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854849.88597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854849.88723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854849.91131: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 11389 1726854849.91194: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.91212: stdout chunk (state=3): >>>import '_codecs' # <<< 11389 1726854849.91238: stdout chunk (state=3): >>>import 'codecs' # <<< 11389 1726854849.91257: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/aliases.py <<< 11389 1726854849.91340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbbc4d0> <<< 11389 1726854849.91354: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdb8bb00> <<< 11389 1726854849.91392: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbbea50> import '_signal' # import '_abc' # import 'abc' # <<< 11389 1726854849.91409: stdout chunk (state=3): >>>import 'io' # <<< 11389 1726854849.91446: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11389 1726854849.91527: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11389 1726854849.91596: stdout chunk (state=3): >>>import 'genericpath' # <<< 11389 1726854849.91613: stdout chunk (state=3): >>>import 'posixpath' # <<< 11389 1726854849.91641: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11389 1726854849.91791: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11389 1726854849.91820: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f45bdbcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbcdfa0> <<< 11389 1726854849.91823: stdout chunk (state=3): >>>import 'site' # <<< 11389 1726854849.91844: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11389 1726854849.92064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11389 1726854849.92108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11389 1726854849.92129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11389 1726854849.92184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11389 1726854849.92203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11389 1726854849.92231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9cbe60> <<< 11389 1726854849.92251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11389 1726854849.92277: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11389 1726854849.92328: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9cbef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11389 1726854849.92359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11389 1726854849.92379: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11389 1726854849.92533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.92537: stdout chunk (state=3): >>>import 'itertools' # <<< 11389 1726854849.92540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda03860> <<< 11389 1726854849.92542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda03ef0> <<< 11389 1726854849.92552: stdout chunk (state=3): >>>import '_collections' # <<< 11389 1726854849.92608: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9e3b30> <<< 11389 1726854849.92624: stdout chunk (state=3): >>>import '_functools' # <<< 11389 1726854849.92627: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f45bd9e1220> <<< 11389 1726854849.92706: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9c9010> <<< 11389 1726854849.92783: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11389 1726854849.92786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11389 1726854849.92856: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11389 1726854849.92859: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11389 1726854849.92970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda223c0> <<< 11389 1726854849.92998: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9e20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9ca8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda587d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f45bd9c8290> <<< 11389 1726854849.93020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11389 1726854849.93103: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda58c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda58b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda58f20> <<< 11389 1726854849.93115: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9c6db0> <<< 11389 1726854849.93182: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11389 1726854849.93209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11389 1726854849.93240: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda595e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda592b0> import 'importlib.machinery' # <<< 11389 
1726854849.93266: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11389 1726854849.93315: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5a4b0> <<< 11389 1726854849.93319: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11389 1726854849.93338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11389 1726854849.93360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11389 1726854849.93422: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda706b0> <<< 11389 1726854849.93426: stdout chunk (state=3): >>>import 'errno' # <<< 11389 1726854849.93473: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda71d60> <<< 11389 1726854849.93477: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11389 1726854849.93534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11389 1726854849.93537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code 
object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11389 1726854849.93563: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda72c00> <<< 11389 1726854849.93582: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda73260> <<< 11389 1726854849.93611: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda72150> <<< 11389 1726854849.93630: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11389 1726854849.93660: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.93671: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda73ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda73410> <<< 11389 1726854849.93743: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5a420> <<< 11389 1726854849.93746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11389 1726854849.93773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11389 1726854849.93784: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11389 1726854849.93810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11389 1726854849.93867: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd7fbc50> <<< 11389 1726854849.93877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11389 1726854849.93930: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd8246b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd824410> <<< 11389 1726854849.93992: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd8246e0> <<< 11389 1726854849.93999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11389 1726854849.94055: stdout chunk (state=3): >>># extension module 
'_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854849.94225: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd825010> <<< 11389 1726854849.94300: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd825a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8248c0> <<< 11389 1726854849.94324: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd7f9df0> <<< 11389 1726854849.94368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11389 1726854849.94407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd826e10> <<< 11389 1726854849.94428: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd825b50> <<< 11389 1726854849.94475: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5abd0> <<< 11389 1726854849.94478: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/__init__.py <<< 11389 1726854849.94563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854849.94620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11389 1726854849.94663: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd84f1a0> <<< 11389 1726854849.94704: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11389 1726854849.94734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11389 1726854849.94826: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd873500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11389 1726854849.94844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11389 1726854849.95123: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8d4230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8d6990> <<< 11389 1726854849.95186: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8d4350> <<< 11389 1726854849.95230: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd899250> <<< 11389 1726854849.95259: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd121370> <<< 11389 1726854849.95280: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd872300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd827d70> <<< 11389 1726854849.95393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11389 1726854849.95415: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f45bd121610> <<< 11389 1726854849.95664: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_yzf0u_xa/ansible_stat_payload.zip' # zipimport: zlib available <<< 11389 1726854849.95795: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.95821: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11389 1726854849.95876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11389 1726854849.95941: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11389 1726854849.95986: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1770b0> import '_typing' # <<< 11389 1726854849.96173: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd155fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd155130> # zipimport: zlib available <<< 11389 1726854849.96230: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11389 1726854849.96313: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 11389 1726854849.96316: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.97672: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854849.98800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd174e00> <<< 11389 1726854849.98930: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11389 
1726854849.98964: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19e9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e090> <<< 11389 1726854849.98991: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11389 1726854849.99035: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd177b30> <<< 11389 1726854849.99065: stdout chunk (state=3): >>>import 'atexit' # <<< 11389 1726854849.99156: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19f770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 11389 
1726854849.99186: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19f950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11389 1726854849.99231: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19fe90> <<< 11389 1726854849.99273: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11389 1726854849.99395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11389 1726854849.99398: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd009c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd00b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11389 1726854849.99420: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00c290> <<< 11389 1726854849.99447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11389 1726854849.99489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11389 
1726854849.99545: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00d430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11389 1726854849.99577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11389 1726854849.99629: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00fec0> <<< 11389 1726854849.99700: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd157140> <<< 11389 1726854849.99711: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11389 1726854849.99797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11389 1726854849.99839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc 
matches /usr/lib64/python3.12/token.py <<< 11389 1726854849.99933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd017e60> import '_tokenize' # <<< 11389 1726854849.99952: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016690> <<< 11389 1726854849.99972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11389 1726854850.00038: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016c00> <<< 11389 1726854850.00084: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00e690> <<< 11389 1726854850.00143: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd05f9e0> <<< 11389 1726854850.00161: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd060170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11389 1726854850.00190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11389 1726854850.00355: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd061be0> <<< 11389 1726854850.00369: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0619a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11389 1726854850.00413: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854850.00437: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0641a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0622d0> <<< 11389 1726854850.00461: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11389 1726854850.00508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854850.00530: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches 
/usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11389 1726854850.00573: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd067980> <<< 11389 1726854850.00713: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd064350> <<< 11389 1726854850.00749: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 11389 1726854850.00775: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068770> <<< 11389 1726854850.00804: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068830> <<< 11389 1726854850.00848: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068ad0> <<< 11389 1726854850.00854: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd060350> <<< 11389 1726854850.00914: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11389 1726854850.01031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11389 1726854850.01045: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f4290> <<< 11389 1726854850.01117: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f5610> <<< 11389 1726854850.01169: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd06aa20> <<< 11389 1726854850.01198: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd06bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd06a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 
11389 1726854850.01258: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.01306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.01398: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.01480: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11389 1726854850.01584: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.01703: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.02230: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.02790: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11389 1726854850.02884: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854850.02910: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f9730> <<< 11389 1726854850.02954: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11389 1726854850.02973: stdout chunk (state=3): >>>import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fa450> <<< 11389 1726854850.03118: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0f5730> <<< 11389 1726854850.03121: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11389 1726854850.03243: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.03397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11389 1726854850.03421: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fa150> # zipimport: zlib available <<< 11389 1726854850.03975: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.04336: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.04418: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.04485: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11389 1726854850.04542: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854850.04565: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11389 1726854850.04756: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11389 1726854850.04762: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 11389 1726854850.04764: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.04778: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11389 1726854850.04834: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11389 1726854850.04865: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11389 1726854850.04876: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.05095: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.05330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11389 1726854850.05396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11389 1726854850.05490: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fb500> # zipimport: zlib available <<< 11389 1726854850.05559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.05631: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11389 1726854850.05650: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11389 1726854850.05698: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.05814: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 11389 1726854850.05857: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.05918: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.06030: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11389 1726854850.06054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11389 1726854850.06159: stdout chunk (state=3): >>># 
extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bcf06180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bcf00d40> <<< 11389 1726854850.06190: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11389 1726854850.06267: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.06322: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.06380: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.06418: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11389 1726854850.06489: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11389 1726854850.06595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11389 1726854850.06602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11389 1726854850.06656: stdout chunk (state=3): 
>>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1d6a50> <<< 11389 1726854850.06675: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1ee720> <<< 11389 1726854850.06784: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bcf05f40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0f59d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11389 1726854850.06832: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11389 1726854850.06872: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11389 1726854850.06936: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11389 1726854850.07123: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.07126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.07250: stdout chunk (state=3): >>># zipimport: zlib available <<< 11389 1726854850.07441: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11389 1726854850.07461: stdout chunk (state=3): >>># destroy __main__ <<< 11389 1726854850.07712: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 11389 1726854850.07740: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore 
sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 11389 1726854850.07788: stdout chunk (state=3): >>># cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing 
importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset <<< 11389 1726854850.07834: stdout chunk (state=3): >>># destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] 
removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 11389 1726854850.08003: stdout chunk (state=3): >>># cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text 
# destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process<<< 11389 1726854850.08043: stdout chunk (state=3): >>> # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11389 1726854850.08175: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11389 1726854850.08224: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11389 1726854850.08247: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 11389 1726854850.08344: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11389 1726854850.08427: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 11389 1726854850.08525: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 11389 1726854850.08538: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 11389 1726854850.08668: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11389 1726854850.08710: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11389 1726854850.08734: stdout chunk (state=3): >>># destroy _collections <<< 11389 1726854850.08776: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11389 1726854850.08817: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11389 1726854850.08921: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 11389 1726854850.09007: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 11389 1726854850.09110: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 11389 1726854850.09113: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # 
destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11389 1726854850.09623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854850.09626: stdout chunk (state=3): >>><<< 11389 1726854850.09628: stderr chunk (state=3): >>><<< 11389 1726854850.09817: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdb8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: 
'/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bdbcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9cbe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 
'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9cbef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda03860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda03ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9e3b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9e1220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9c9010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f45bda237a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda223c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9e20f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9ca8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda587d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9c8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda58c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda58b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda58f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd9c6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda595e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda592b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda706b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda71d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda72c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda73260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda72150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bda73ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda73410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5a420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd7fbc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd8246b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd824410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd8246e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd825010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd825a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8248c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd7f9df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd826e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd825b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bda5abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd84f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd873500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f45bd8d4230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8d6990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd8d4350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd899250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd121370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd872300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd827d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f45bd121610> # zipimport: found 30 names in '/tmp/ansible_stat_payload_yzf0u_xa/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1770b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd155fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd155130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd174e00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19e9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e090> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19e4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd177b30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19f770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd19f950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd19fe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd009c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd00b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00c290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00d430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00fec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd157140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd017e60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016690> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd016c00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd00e690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd05f9e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd060170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd061be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0619a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0641a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0622d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd067980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd064350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068770> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd068ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd060350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f4290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f5610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd06aa20> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd06bdd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd06a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bd0f9730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fa450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0f5730> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fa150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0fb500> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f45bcf06180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bcf00d40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1d6a50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd1ee720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bcf05f40> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f45bd0f59d0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ ... # clear sys.audit hooks 11389 1726854850.10910: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854850.10914: _low_level_execute_command(): starting 11389 1726854850.10916: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854849.632442-11552-265773918633970/ > /dev/null 2>&1 && sleep 0' 11389 1726854850.12065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854850.12068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.12071: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854850.12074: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854850.12076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.12278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854850.12282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854850.12360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.12443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 
1726854850.14317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854850.14411: stderr chunk (state=3): >>><<< 11389 1726854850.14415: stdout chunk (state=3): >>><<< 11389 1726854850.14448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854850.14468: handler run complete 11389 1726854850.14534: attempt loop complete, returning result 11389 1726854850.14541: _execute() done 11389 1726854850.14548: dumping result to json 11389 1726854850.14557: done dumping result, returning 11389 1726854850.14569: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0affcc66-ac2b-deb8-c119-0000000000df] 11389 1726854850.14579: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000df ok: [managed_node3] => { "changed": false, "stat": { "exists": false 
} } 11389 1726854850.14777: no more pending results, returning what we have 11389 1726854850.14781: results queue empty 11389 1726854850.14782: checking for any_errors_fatal 11389 1726854850.14791: done checking for any_errors_fatal 11389 1726854850.14792: checking for max_fail_percentage 11389 1726854850.14794: done checking for max_fail_percentage 11389 1726854850.14795: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.14796: done checking to see if all hosts have failed 11389 1726854850.14796: getting the remaining hosts for this loop 11389 1726854850.14798: done getting the remaining hosts for this loop 11389 1726854850.14802: getting the next task for host managed_node3 11389 1726854850.14809: done getting next task for host managed_node3 11389 1726854850.14812: ^ task is: TASK: Set flag to indicate system is ostree 11389 1726854850.14815: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.14820: getting variables 11389 1726854850.14821: in VariableManager get_vars() 11389 1726854850.14854: Calling all_inventory to load vars for managed_node3 11389 1726854850.14858: Calling groups_inventory to load vars for managed_node3 11389 1726854850.14862: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.14873: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.14877: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.14880: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.15416: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000df 11389 1726854850.15419: WORKER PROCESS EXITING 11389 1726854850.15443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.15636: done with get_vars() 11389 1726854850.15650: done getting variables 11389 1726854850.15735: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:54:10 -0400 (0:00:00.585) 0:00:02.580 ****** 11389 1726854850.15768: entering _queue_task() for managed_node3/set_fact 11389 1726854850.15770: Creating lock for set_fact 11389 1726854850.16045: worker is 1 (out of 1 available) 11389 1726854850.16057: exiting _queue_task() for managed_node3/set_fact 11389 1726854850.16069: done queuing things up, now waiting for results queue to drain 11389 1726854850.16070: waiting for pending results... 
11389 1726854850.16290: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 11389 1726854850.16426: in run() - task 0affcc66-ac2b-deb8-c119-0000000000e0 11389 1726854850.16455: variable 'ansible_search_path' from source: unknown 11389 1726854850.16471: variable 'ansible_search_path' from source: unknown 11389 1726854850.16516: calling self._execute() 11389 1726854850.16650: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.16653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.16656: variable 'omit' from source: magic vars 11389 1726854850.17333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854850.17790: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854850.17878: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854850.17917: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854850.17993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854850.18493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854850.18497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854850.18501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854850.18502: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854850.18678: Evaluated conditional (not __network_is_ostree is defined): True 11389 1726854850.18736: variable 'omit' from source: magic vars 11389 1726854850.18856: variable 'omit' from source: magic vars 11389 1726854850.19116: variable '__ostree_booted_stat' from source: set_fact 11389 1726854850.19179: variable 'omit' from source: magic vars 11389 1726854850.19227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854850.19259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854850.19294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854850.19323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.19348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.19393: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854850.19402: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.19409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.19504: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854850.19516: Set connection var ansible_timeout to 10 11389 1726854850.19521: Set connection var ansible_connection to ssh 11389 1726854850.19529: Set connection var ansible_shell_type to sh 11389 1726854850.19535: Set connection var ansible_pipelining to False 11389 1726854850.19542: Set connection var ansible_shell_executable to /bin/sh 11389 1726854850.19694: variable 'ansible_shell_executable' 
from source: unknown 11389 1726854850.19698: variable 'ansible_connection' from source: unknown 11389 1726854850.19700: variable 'ansible_module_compression' from source: unknown 11389 1726854850.19708: variable 'ansible_shell_type' from source: unknown 11389 1726854850.19711: variable 'ansible_shell_executable' from source: unknown 11389 1726854850.19713: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.19715: variable 'ansible_pipelining' from source: unknown 11389 1726854850.19717: variable 'ansible_timeout' from source: unknown 11389 1726854850.19720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.19792: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854850.19796: variable 'omit' from source: magic vars 11389 1726854850.19798: starting attempt loop 11389 1726854850.19801: running the handler 11389 1726854850.19803: handler run complete 11389 1726854850.19810: attempt loop complete, returning result 11389 1726854850.19831: _execute() done 11389 1726854850.19837: dumping result to json 11389 1726854850.19840: done dumping result, returning 11389 1726854850.19842: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0affcc66-ac2b-deb8-c119-0000000000e0] 11389 1726854850.19844: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e0 11389 1726854850.20018: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e0 11389 1726854850.20021: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11389 1726854850.20103: no more pending results, returning what we have 11389 1726854850.20106: results 
queue empty 11389 1726854850.20107: checking for any_errors_fatal 11389 1726854850.20113: done checking for any_errors_fatal 11389 1726854850.20113: checking for max_fail_percentage 11389 1726854850.20115: done checking for max_fail_percentage 11389 1726854850.20116: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.20116: done checking to see if all hosts have failed 11389 1726854850.20117: getting the remaining hosts for this loop 11389 1726854850.20118: done getting the remaining hosts for this loop 11389 1726854850.20122: getting the next task for host managed_node3 11389 1726854850.20129: done getting next task for host managed_node3 11389 1726854850.20132: ^ task is: TASK: Fix CentOS6 Base repo 11389 1726854850.20134: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.20142: getting variables 11389 1726854850.20144: in VariableManager get_vars() 11389 1726854850.20172: Calling all_inventory to load vars for managed_node3 11389 1726854850.20174: Calling groups_inventory to load vars for managed_node3 11389 1726854850.20177: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.20186: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.20190: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.20198: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.20445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.20654: done with get_vars() 11389 1726854850.20665: done getting variables 11389 1726854850.20802: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:54:10 -0400 (0:00:00.050) 0:00:02.631 ****** 11389 1726854850.20839: entering _queue_task() for managed_node3/copy 11389 1726854850.21364: worker is 1 (out of 1 available) 11389 1726854850.21493: exiting _queue_task() for managed_node3/copy 11389 1726854850.21503: done queuing things up, now waiting for results queue to drain 11389 1726854850.21520: waiting for pending results... 
11389 1726854850.21806: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 11389 1726854850.22012: in run() - task 0affcc66-ac2b-deb8-c119-0000000000e2 11389 1726854850.22015: variable 'ansible_search_path' from source: unknown 11389 1726854850.22018: variable 'ansible_search_path' from source: unknown 11389 1726854850.22021: calling self._execute() 11389 1726854850.22285: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.22295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.22299: variable 'omit' from source: magic vars 11389 1726854850.22818: variable 'ansible_distribution' from source: facts 11389 1726854850.22843: Evaluated conditional (ansible_distribution == 'CentOS'): True 11389 1726854850.22958: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.22969: Evaluated conditional (ansible_distribution_major_version == '6'): False 11389 1726854850.23010: when evaluation is False, skipping this task 11389 1726854850.23014: _execute() done 11389 1726854850.23016: dumping result to json 11389 1726854850.23018: done dumping result, returning 11389 1726854850.23020: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0affcc66-ac2b-deb8-c119-0000000000e2] 11389 1726854850.23022: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e2 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11389 1726854850.23286: no more pending results, returning what we have 11389 1726854850.23291: results queue empty 11389 1726854850.23291: checking for any_errors_fatal 11389 1726854850.23296: done checking for any_errors_fatal 11389 1726854850.23297: checking for max_fail_percentage 11389 1726854850.23299: done checking for max_fail_percentage 11389 1726854850.23300: checking to see if all hosts have failed and the 
running result is not ok 11389 1726854850.23301: done checking to see if all hosts have failed 11389 1726854850.23301: getting the remaining hosts for this loop 11389 1726854850.23303: done getting the remaining hosts for this loop 11389 1726854850.23306: getting the next task for host managed_node3 11389 1726854850.23313: done getting next task for host managed_node3 11389 1726854850.23316: ^ task is: TASK: Include the task 'enable_epel.yml' 11389 1726854850.23319: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.23324: getting variables 11389 1726854850.23326: in VariableManager get_vars() 11389 1726854850.23500: Calling all_inventory to load vars for managed_node3 11389 1726854850.23503: Calling groups_inventory to load vars for managed_node3 11389 1726854850.23507: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.23513: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e2 11389 1726854850.23516: WORKER PROCESS EXITING 11389 1726854850.23526: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.23529: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.23532: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.23702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.23946: done with get_vars() 11389 1726854850.23956: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:54:10 -0400 (0:00:00.032) 0:00:02.663 ****** 11389 1726854850.24076: entering _queue_task() for managed_node3/include_tasks 11389 1726854850.24374: worker is 1 (out of 1 available) 11389 1726854850.24386: exiting _queue_task() for managed_node3/include_tasks 11389 1726854850.24411: done queuing things up, now waiting for results queue to drain 11389 1726854850.24413: waiting for pending results... 
11389 1726854850.24594: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 11389 1726854850.24704: in run() - task 0affcc66-ac2b-deb8-c119-0000000000e3 11389 1726854850.24723: variable 'ansible_search_path' from source: unknown 11389 1726854850.24730: variable 'ansible_search_path' from source: unknown 11389 1726854850.24779: calling self._execute() 11389 1726854850.24850: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.24869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.24885: variable 'omit' from source: magic vars 11389 1726854850.25347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854850.27646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854850.27728: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854850.27793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854850.27818: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854850.27850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854850.27944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854850.27993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854850.28012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854850.28070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854850.28134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854850.28225: variable '__network_is_ostree' from source: set_fact 11389 1726854850.28257: Evaluated conditional (not __network_is_ostree | d(false)): True 11389 1726854850.28267: _execute() done 11389 1726854850.28273: dumping result to json 11389 1726854850.28280: done dumping result, returning 11389 1726854850.28291: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0affcc66-ac2b-deb8-c119-0000000000e3] 11389 1726854850.28347: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e3 11389 1726854850.28477: no more pending results, returning what we have 11389 1726854850.28483: in VariableManager get_vars() 11389 1726854850.28520: Calling all_inventory to load vars for managed_node3 11389 1726854850.28523: Calling groups_inventory to load vars for managed_node3 11389 1726854850.28527: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.28537: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.28540: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.28543: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.28901: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000e3 11389 1726854850.28904: WORKER PROCESS EXITING 11389 1726854850.28936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11389 1726854850.29176: done with get_vars() 11389 1726854850.29185: variable 'ansible_search_path' from source: unknown 11389 1726854850.29186: variable 'ansible_search_path' from source: unknown 11389 1726854850.29225: we have included files to process 11389 1726854850.29226: generating all_blocks data 11389 1726854850.29228: done generating all_blocks data 11389 1726854850.29241: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11389 1726854850.29243: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11389 1726854850.29247: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11389 1726854850.30159: done processing included file 11389 1726854850.30162: iterating over new_blocks loaded from include file 11389 1726854850.30163: in VariableManager get_vars() 11389 1726854850.30174: done with get_vars() 11389 1726854850.30175: filtering new block on tags 11389 1726854850.30199: done filtering new block on tags 11389 1726854850.30202: in VariableManager get_vars() 11389 1726854850.30213: done with get_vars() 11389 1726854850.30214: filtering new block on tags 11389 1726854850.30227: done filtering new block on tags 11389 1726854850.30229: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 11389 1726854850.30234: extending task lists for all hosts with included blocks 11389 1726854850.30337: done extending task lists 11389 1726854850.30338: done processing included files 11389 1726854850.30339: results queue empty 11389 1726854850.30340: checking for any_errors_fatal 11389 1726854850.30343: done checking for any_errors_fatal 11389 1726854850.30344: checking for max_fail_percentage 11389 1726854850.30345: done 
checking for max_fail_percentage 11389 1726854850.30346: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.30347: done checking to see if all hosts have failed 11389 1726854850.30348: getting the remaining hosts for this loop 11389 1726854850.30349: done getting the remaining hosts for this loop 11389 1726854850.30351: getting the next task for host managed_node3 11389 1726854850.30355: done getting next task for host managed_node3 11389 1726854850.30357: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11389 1726854850.30359: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.30361: getting variables 11389 1726854850.30362: in VariableManager get_vars() 11389 1726854850.30370: Calling all_inventory to load vars for managed_node3 11389 1726854850.30372: Calling groups_inventory to load vars for managed_node3 11389 1726854850.30374: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.30379: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.30386: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.30391: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.30548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.30747: done with get_vars() 11389 1726854850.30755: done getting variables 11389 1726854850.30819: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 11389 1726854850.31013: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:54:10 -0400 (0:00:00.069) 0:00:02.733 ****** 11389 1726854850.31068: entering _queue_task() for managed_node3/command 11389 1726854850.31070: Creating lock for command 11389 1726854850.31384: worker is 1 (out of 1 available) 11389 1726854850.31399: exiting _queue_task() for managed_node3/command 11389 1726854850.31411: done queuing things up, now waiting for results queue to drain 11389 1726854850.31412: waiting for pending results... 
11389 1726854850.31630: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 11389 1726854850.31793: in run() - task 0affcc66-ac2b-deb8-c119-0000000000fd 11389 1726854850.31797: variable 'ansible_search_path' from source: unknown 11389 1726854850.31800: variable 'ansible_search_path' from source: unknown 11389 1726854850.31825: calling self._execute() 11389 1726854850.31904: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.31924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.31940: variable 'omit' from source: magic vars 11389 1726854850.32328: variable 'ansible_distribution' from source: facts 11389 1726854850.32358: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11389 1726854850.32577: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.32580: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11389 1726854850.32582: when evaluation is False, skipping this task 11389 1726854850.32585: _execute() done 11389 1726854850.32589: dumping result to json 11389 1726854850.32591: done dumping result, returning 11389 1726854850.32594: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0affcc66-ac2b-deb8-c119-0000000000fd] 11389 1726854850.32596: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000fd 11389 1726854850.32671: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000fd 11389 1726854850.32675: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11389 1726854850.32936: no more pending results, returning what we have 11389 1726854850.32939: results queue empty 11389 1726854850.32940: checking for any_errors_fatal 11389 1726854850.32941: done checking for any_errors_fatal 11389 1726854850.32942: checking for 
max_fail_percentage 11389 1726854850.32943: done checking for max_fail_percentage 11389 1726854850.32944: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.32945: done checking to see if all hosts have failed 11389 1726854850.32946: getting the remaining hosts for this loop 11389 1726854850.32947: done getting the remaining hosts for this loop 11389 1726854850.32950: getting the next task for host managed_node3 11389 1726854850.32956: done getting next task for host managed_node3 11389 1726854850.32958: ^ task is: TASK: Install yum-utils package 11389 1726854850.32961: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.32965: getting variables 11389 1726854850.32966: in VariableManager get_vars() 11389 1726854850.32996: Calling all_inventory to load vars for managed_node3 11389 1726854850.33000: Calling groups_inventory to load vars for managed_node3 11389 1726854850.33004: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.33014: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.33017: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.33020: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.33195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.33430: done with get_vars() 11389 1726854850.33439: done getting variables 11389 1726854850.33533: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:54:10 -0400 (0:00:00.024) 0:00:02.758 ****** 11389 1726854850.33560: entering _queue_task() for managed_node3/package 11389 1726854850.33561: Creating lock for package 11389 1726854850.33816: worker is 1 (out of 1 available) 11389 1726854850.33828: exiting _queue_task() for managed_node3/package 11389 1726854850.33839: done queuing things up, now waiting for results queue to drain 11389 1726854850.33841: waiting for pending results... 
11389 1726854850.34044: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 11389 1726854850.34159: in run() - task 0affcc66-ac2b-deb8-c119-0000000000fe 11389 1726854850.34181: variable 'ansible_search_path' from source: unknown 11389 1726854850.34261: variable 'ansible_search_path' from source: unknown 11389 1726854850.34265: calling self._execute() 11389 1726854850.34390: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.34402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.34417: variable 'omit' from source: magic vars 11389 1726854850.34852: variable 'ansible_distribution' from source: facts 11389 1726854850.34874: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11389 1726854850.35030: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.35048: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11389 1726854850.35057: when evaluation is False, skipping this task 11389 1726854850.35131: _execute() done 11389 1726854850.35134: dumping result to json 11389 1726854850.35137: done dumping result, returning 11389 1726854850.35140: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0affcc66-ac2b-deb8-c119-0000000000fe] 11389 1726854850.35142: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000fe skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11389 1726854850.35308: no more pending results, returning what we have 11389 1726854850.35311: results queue empty 11389 1726854850.35312: checking for any_errors_fatal 11389 1726854850.35317: done checking for any_errors_fatal 11389 1726854850.35317: checking for max_fail_percentage 11389 1726854850.35320: done checking for max_fail_percentage 11389 1726854850.35321: checking to see if 
all hosts have failed and the running result is not ok 11389 1726854850.35322: done checking to see if all hosts have failed 11389 1726854850.35322: getting the remaining hosts for this loop 11389 1726854850.35324: done getting the remaining hosts for this loop 11389 1726854850.35327: getting the next task for host managed_node3 11389 1726854850.35334: done getting next task for host managed_node3 11389 1726854850.35337: ^ task is: TASK: Enable EPEL 7 11389 1726854850.35340: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.35343: getting variables 11389 1726854850.35345: in VariableManager get_vars() 11389 1726854850.35371: Calling all_inventory to load vars for managed_node3 11389 1726854850.35376: Calling groups_inventory to load vars for managed_node3 11389 1726854850.35379: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.35610: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.35614: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.35618: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.35771: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000fe 11389 1726854850.35775: WORKER PROCESS EXITING 11389 1726854850.35823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.36028: done with get_vars() 11389 1726854850.36037: done getting variables 11389 1726854850.36094: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:54:10 -0400 (0:00:00.025) 0:00:02.784 ****** 11389 1726854850.36123: entering _queue_task() for managed_node3/command 11389 1726854850.36374: worker is 1 (out of 1 available) 11389 1726854850.36390: exiting _queue_task() for managed_node3/command 11389 1726854850.36402: done queuing things up, now waiting for results queue to drain 11389 1726854850.36403: waiting for pending results... 
11389 1726854850.36616: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 11389 1726854850.36740: in run() - task 0affcc66-ac2b-deb8-c119-0000000000ff 11389 1726854850.36762: variable 'ansible_search_path' from source: unknown 11389 1726854850.36773: variable 'ansible_search_path' from source: unknown 11389 1726854850.36820: calling self._execute() 11389 1726854850.36993: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.36998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.37000: variable 'omit' from source: magic vars 11389 1726854850.37357: variable 'ansible_distribution' from source: facts 11389 1726854850.37378: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11389 1726854850.37520: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.37531: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11389 1726854850.37539: when evaluation is False, skipping this task 11389 1726854850.37547: _execute() done 11389 1726854850.37793: dumping result to json 11389 1726854850.37797: done dumping result, returning 11389 1726854850.37800: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0affcc66-ac2b-deb8-c119-0000000000ff] 11389 1726854850.37803: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000ff 11389 1726854850.37866: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000ff 11389 1726854850.37872: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11389 1726854850.37915: no more pending results, returning what we have 11389 1726854850.37918: results queue empty 11389 1726854850.37919: checking for any_errors_fatal 11389 1726854850.37923: done checking for any_errors_fatal 11389 1726854850.37924: checking for 
max_fail_percentage 11389 1726854850.37926: done checking for max_fail_percentage 11389 1726854850.37927: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.37928: done checking to see if all hosts have failed 11389 1726854850.37928: getting the remaining hosts for this loop 11389 1726854850.37930: done getting the remaining hosts for this loop 11389 1726854850.37933: getting the next task for host managed_node3 11389 1726854850.37938: done getting next task for host managed_node3 11389 1726854850.37941: ^ task is: TASK: Enable EPEL 8 11389 1726854850.37944: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.37948: getting variables 11389 1726854850.37949: in VariableManager get_vars() 11389 1726854850.37978: Calling all_inventory to load vars for managed_node3 11389 1726854850.37980: Calling groups_inventory to load vars for managed_node3 11389 1726854850.37986: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.37997: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.38000: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.38004: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.38241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.38477: done with get_vars() 11389 1726854850.38489: done getting variables 11389 1726854850.38552: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:54:10 -0400 (0:00:00.024) 0:00:02.808 ****** 11389 1726854850.38585: entering _queue_task() for managed_node3/command 11389 1726854850.38844: worker is 1 (out of 1 available) 11389 1726854850.38860: exiting _queue_task() for managed_node3/command 11389 1726854850.38874: done queuing things up, now waiting for results queue to drain 11389 1726854850.38876: waiting for pending results... 
11389 1726854850.39091: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 11389 1726854850.39297: in run() - task 0affcc66-ac2b-deb8-c119-000000000100 11389 1726854850.39301: variable 'ansible_search_path' from source: unknown 11389 1726854850.39303: variable 'ansible_search_path' from source: unknown 11389 1726854850.39305: calling self._execute() 11389 1726854850.39320: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.39330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.39340: variable 'omit' from source: magic vars 11389 1726854850.39710: variable 'ansible_distribution' from source: facts 11389 1726854850.39736: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11389 1726854850.39875: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.39886: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11389 1726854850.39896: when evaluation is False, skipping this task 11389 1726854850.39905: _execute() done 11389 1726854850.39912: dumping result to json 11389 1726854850.39919: done dumping result, returning 11389 1726854850.39937: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0affcc66-ac2b-deb8-c119-000000000100] 11389 1726854850.39957: sending task result for task 0affcc66-ac2b-deb8-c119-000000000100 11389 1726854850.40178: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000100 11389 1726854850.40182: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11389 1726854850.40228: no more pending results, returning what we have 11389 1726854850.40231: results queue empty 11389 1726854850.40232: checking for any_errors_fatal 11389 1726854850.40236: done checking for any_errors_fatal 11389 1726854850.40236: checking for 
max_fail_percentage 11389 1726854850.40238: done checking for max_fail_percentage 11389 1726854850.40239: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.40240: done checking to see if all hosts have failed 11389 1726854850.40240: getting the remaining hosts for this loop 11389 1726854850.40241: done getting the remaining hosts for this loop 11389 1726854850.40244: getting the next task for host managed_node3 11389 1726854850.40251: done getting next task for host managed_node3 11389 1726854850.40253: ^ task is: TASK: Enable EPEL 6 11389 1726854850.40258: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.40260: getting variables 11389 1726854850.40262: in VariableManager get_vars() 11389 1726854850.40294: Calling all_inventory to load vars for managed_node3 11389 1726854850.40297: Calling groups_inventory to load vars for managed_node3 11389 1726854850.40299: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.40308: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.40310: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.40313: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.40489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.40686: done with get_vars() 11389 1726854850.40697: done getting variables 11389 1726854850.40757: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:54:10 -0400 (0:00:00.022) 0:00:02.830 ****** 11389 1726854850.40790: entering _queue_task() for managed_node3/copy 11389 1726854850.41144: worker is 1 (out of 1 available) 11389 1726854850.41154: exiting _queue_task() for managed_node3/copy 11389 1726854850.41164: done queuing things up, now waiting for results queue to drain 11389 1726854850.41166: waiting for pending results... 
11389 1726854850.41373: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 11389 1726854850.41434: in run() - task 0affcc66-ac2b-deb8-c119-000000000102 11389 1726854850.41450: variable 'ansible_search_path' from source: unknown 11389 1726854850.41457: variable 'ansible_search_path' from source: unknown 11389 1726854850.41507: calling self._execute() 11389 1726854850.41586: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.41609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.41689: variable 'omit' from source: magic vars 11389 1726854850.42003: variable 'ansible_distribution' from source: facts 11389 1726854850.42026: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11389 1726854850.42151: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.42154: Evaluated conditional (ansible_distribution_major_version == '6'): False 11389 1726854850.42231: when evaluation is False, skipping this task 11389 1726854850.42234: _execute() done 11389 1726854850.42237: dumping result to json 11389 1726854850.42239: done dumping result, returning 11389 1726854850.42242: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0affcc66-ac2b-deb8-c119-000000000102] 11389 1726854850.42244: sending task result for task 0affcc66-ac2b-deb8-c119-000000000102 11389 1726854850.42325: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000102 11389 1726854850.42328: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11389 1726854850.42386: no more pending results, returning what we have 11389 1726854850.42392: results queue empty 11389 1726854850.42393: checking for any_errors_fatal 11389 1726854850.42397: done checking for any_errors_fatal 11389 1726854850.42398: checking for max_fail_percentage 
11389 1726854850.42400: done checking for max_fail_percentage 11389 1726854850.42402: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.42403: done checking to see if all hosts have failed 11389 1726854850.42403: getting the remaining hosts for this loop 11389 1726854850.42405: done getting the remaining hosts for this loop 11389 1726854850.42408: getting the next task for host managed_node3 11389 1726854850.42418: done getting next task for host managed_node3 11389 1726854850.42421: ^ task is: TASK: Set network provider to 'nm' 11389 1726854850.42424: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854850.42428: getting variables 11389 1726854850.42430: in VariableManager get_vars() 11389 1726854850.42465: Calling all_inventory to load vars for managed_node3 11389 1726854850.42472: Calling groups_inventory to load vars for managed_node3 11389 1726854850.42476: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.42704: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.42708: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.42712: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.43327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.43528: done with get_vars() 11389 1726854850.43538: done getting variables 11389 1726854850.43611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Friday 20 September 2024 13:54:10 -0400 (0:00:00.028) 0:00:02.859 ****** 11389 1726854850.43638: entering _queue_task() for managed_node3/set_fact 11389 1726854850.44033: worker is 1 (out of 1 available) 11389 1726854850.44044: exiting _queue_task() for managed_node3/set_fact 11389 1726854850.44053: done queuing things up, now waiting for results queue to drain 11389 1726854850.44055: waiting for pending results... 11389 1726854850.44296: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 11389 1726854850.44339: in run() - task 0affcc66-ac2b-deb8-c119-000000000007 11389 1726854850.44394: variable 'ansible_search_path' from source: unknown 11389 1726854850.44409: calling self._execute() 11389 1726854850.44499: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.44556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.44560: variable 'omit' from source: magic vars 11389 1726854850.44648: variable 'omit' from source: magic vars 11389 1726854850.44693: variable 'omit' from source: magic vars 11389 1726854850.44741: variable 'omit' from source: magic vars 11389 1726854850.44795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854850.44843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854850.44883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854850.44899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.44941: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.44959: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854850.44972: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.45101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.45104: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854850.45117: Set connection var ansible_timeout to 10 11389 1726854850.45126: Set connection var ansible_connection to ssh 11389 1726854850.45138: Set connection var ansible_shell_type to sh 11389 1726854850.45148: Set connection var ansible_pipelining to False 11389 1726854850.45159: Set connection var ansible_shell_executable to /bin/sh 11389 1726854850.45191: variable 'ansible_shell_executable' from source: unknown 11389 1726854850.45201: variable 'ansible_connection' from source: unknown 11389 1726854850.45219: variable 'ansible_module_compression' from source: unknown 11389 1726854850.45227: variable 'ansible_shell_type' from source: unknown 11389 1726854850.45234: variable 'ansible_shell_executable' from source: unknown 11389 1726854850.45241: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.45249: variable 'ansible_pipelining' from source: unknown 11389 1726854850.45255: variable 'ansible_timeout' from source: unknown 11389 1726854850.45262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.45431: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854850.45450: variable 'omit' from source: magic vars 11389 1726854850.45534: starting 
attempt loop 11389 1726854850.45538: running the handler 11389 1726854850.45542: handler run complete 11389 1726854850.45545: attempt loop complete, returning result 11389 1726854850.45547: _execute() done 11389 1726854850.45549: dumping result to json 11389 1726854850.45551: done dumping result, returning 11389 1726854850.45553: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0affcc66-ac2b-deb8-c119-000000000007] 11389 1726854850.45555: sending task result for task 0affcc66-ac2b-deb8-c119-000000000007 11389 1726854850.45714: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000007 11389 1726854850.45717: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11389 1726854850.45778: no more pending results, returning what we have 11389 1726854850.45781: results queue empty 11389 1726854850.45783: checking for any_errors_fatal 11389 1726854850.45791: done checking for any_errors_fatal 11389 1726854850.45792: checking for max_fail_percentage 11389 1726854850.45794: done checking for max_fail_percentage 11389 1726854850.45795: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.45796: done checking to see if all hosts have failed 11389 1726854850.45797: getting the remaining hosts for this loop 11389 1726854850.45798: done getting the remaining hosts for this loop 11389 1726854850.45802: getting the next task for host managed_node3 11389 1726854850.45809: done getting next task for host managed_node3 11389 1726854850.45811: ^ task is: TASK: meta (flush_handlers) 11389 1726854850.45813: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.45817: getting variables 11389 1726854850.45819: in VariableManager get_vars() 11389 1726854850.45850: Calling all_inventory to load vars for managed_node3 11389 1726854850.45852: Calling groups_inventory to load vars for managed_node3 11389 1726854850.45857: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.45871: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.45875: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.45878: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.46285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.46483: done with get_vars() 11389 1726854850.46494: done getting variables 11389 1726854850.46565: in VariableManager get_vars() 11389 1726854850.46578: Calling all_inventory to load vars for managed_node3 11389 1726854850.46581: Calling groups_inventory to load vars for managed_node3 11389 1726854850.46583: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.46590: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.46593: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.46596: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.46734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.46957: done with get_vars() 11389 1726854850.46972: done queuing things up, now waiting for results queue to drain 11389 1726854850.46974: results queue empty 11389 1726854850.46974: checking for any_errors_fatal 11389 1726854850.46976: done checking for any_errors_fatal 11389 1726854850.46981: checking for max_fail_percentage 11389 1726854850.46982: done checking for max_fail_percentage 11389 1726854850.46983: checking to see if all hosts have failed and the running result is not 
ok 11389 1726854850.46983: done checking to see if all hosts have failed 11389 1726854850.46984: getting the remaining hosts for this loop 11389 1726854850.46985: done getting the remaining hosts for this loop 11389 1726854850.46989: getting the next task for host managed_node3 11389 1726854850.46993: done getting next task for host managed_node3 11389 1726854850.46995: ^ task is: TASK: meta (flush_handlers) 11389 1726854850.46996: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854850.47005: getting variables 11389 1726854850.47006: in VariableManager get_vars() 11389 1726854850.47014: Calling all_inventory to load vars for managed_node3 11389 1726854850.47016: Calling groups_inventory to load vars for managed_node3 11389 1726854850.47019: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.47023: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.47025: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.47028: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.47170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.47362: done with get_vars() 11389 1726854850.47373: done getting variables 11389 1726854850.47430: in VariableManager get_vars() 11389 1726854850.47438: Calling all_inventory to load vars for managed_node3 11389 1726854850.47440: Calling groups_inventory to load vars for managed_node3 11389 1726854850.47442: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.47446: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.47448: Calling groups_plugins_inventory to load vars for 
managed_node3 11389 1726854850.47450: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.47593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.47801: done with get_vars() 11389 1726854850.47813: done queuing things up, now waiting for results queue to drain 11389 1726854850.47815: results queue empty 11389 1726854850.47816: checking for any_errors_fatal 11389 1726854850.47817: done checking for any_errors_fatal 11389 1726854850.47818: checking for max_fail_percentage 11389 1726854850.47819: done checking for max_fail_percentage 11389 1726854850.47819: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.47820: done checking to see if all hosts have failed 11389 1726854850.47821: getting the remaining hosts for this loop 11389 1726854850.47821: done getting the remaining hosts for this loop 11389 1726854850.47824: getting the next task for host managed_node3 11389 1726854850.47826: done getting next task for host managed_node3 11389 1726854850.47827: ^ task is: None 11389 1726854850.47828: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.47829: done queuing things up, now waiting for results queue to drain 11389 1726854850.47830: results queue empty 11389 1726854850.47831: checking for any_errors_fatal 11389 1726854850.47831: done checking for any_errors_fatal 11389 1726854850.47832: checking for max_fail_percentage 11389 1726854850.47833: done checking for max_fail_percentage 11389 1726854850.47833: checking to see if all hosts have failed and the running result is not ok 11389 1726854850.47834: done checking to see if all hosts have failed 11389 1726854850.47835: getting the next task for host managed_node3 11389 1726854850.47837: done getting next task for host managed_node3 11389 1726854850.47838: ^ task is: None 11389 1726854850.47839: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.47890: in VariableManager get_vars() 11389 1726854850.47914: done with get_vars() 11389 1726854850.47921: in VariableManager get_vars() 11389 1726854850.47936: done with get_vars() 11389 1726854850.47942: variable 'omit' from source: magic vars 11389 1726854850.47990: in VariableManager get_vars() 11389 1726854850.48007: done with get_vars() 11389 1726854850.48030: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 11389 1726854850.48764: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11389 1726854850.48794: getting the remaining hosts for this loop 11389 1726854850.48796: done getting the remaining hosts for this loop 11389 1726854850.48799: getting the next task for host managed_node3 11389 1726854850.48802: done getting next task for host managed_node3 11389 1726854850.48804: ^ task is: TASK: Gathering Facts 11389 1726854850.48805: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854850.48807: getting variables 11389 1726854850.48808: in VariableManager get_vars() 11389 1726854850.48820: Calling all_inventory to load vars for managed_node3 11389 1726854850.48826: Calling groups_inventory to load vars for managed_node3 11389 1726854850.48829: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854850.48834: Calling all_plugins_play to load vars for managed_node3 11389 1726854850.48847: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854850.48851: Calling groups_plugins_play to load vars for managed_node3 11389 1726854850.49002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854850.49199: done with get_vars() 11389 1726854850.49207: done getting variables 11389 1726854850.49243: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Friday 20 September 2024 13:54:10 -0400 (0:00:00.058) 0:00:02.918 ****** 11389 1726854850.49500: entering _queue_task() for managed_node3/gather_facts 11389 1726854850.49853: worker is 1 (out of 1 available) 11389 1726854850.49864: exiting _queue_task() for managed_node3/gather_facts 11389 1726854850.49875: done queuing things up, now waiting for results queue to drain 11389 1726854850.49877: waiting for pending results... 
11389 1726854850.50346: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11389 1726854850.50690: in run() - task 0affcc66-ac2b-deb8-c119-000000000128 11389 1726854850.50699: variable 'ansible_search_path' from source: unknown 11389 1726854850.50918: calling self._execute() 11389 1726854850.51123: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.51127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.51135: variable 'omit' from source: magic vars 11389 1726854850.51734: variable 'ansible_distribution_major_version' from source: facts 11389 1726854850.51751: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854850.51761: variable 'omit' from source: magic vars 11389 1726854850.51805: variable 'omit' from source: magic vars 11389 1726854850.51848: variable 'omit' from source: magic vars 11389 1726854850.51903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854850.51939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854850.51994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854850.51997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.52003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854850.52039: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854850.52049: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.52057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.52194: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 
1726854850.52201: Set connection var ansible_timeout to 10 11389 1726854850.52295: Set connection var ansible_connection to ssh 11389 1726854850.52299: Set connection var ansible_shell_type to sh 11389 1726854850.52302: Set connection var ansible_pipelining to False 11389 1726854850.52306: Set connection var ansible_shell_executable to /bin/sh 11389 1726854850.52308: variable 'ansible_shell_executable' from source: unknown 11389 1726854850.52311: variable 'ansible_connection' from source: unknown 11389 1726854850.52320: variable 'ansible_module_compression' from source: unknown 11389 1726854850.52323: variable 'ansible_shell_type' from source: unknown 11389 1726854850.52325: variable 'ansible_shell_executable' from source: unknown 11389 1726854850.52327: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854850.52329: variable 'ansible_pipelining' from source: unknown 11389 1726854850.52331: variable 'ansible_timeout' from source: unknown 11389 1726854850.52333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854850.52515: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854850.52538: variable 'omit' from source: magic vars 11389 1726854850.52549: starting attempt loop 11389 1726854850.52556: running the handler 11389 1726854850.52580: variable 'ansible_facts' from source: unknown 11389 1726854850.52608: _low_level_execute_command(): starting 11389 1726854850.52622: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854850.53425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.53544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.53606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854850.55295: stdout chunk (state=3): >>>/root <<< 11389 1726854850.55693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854850.55696: stdout chunk (state=3): >>><<< 11389 1726854850.55698: stderr chunk (state=3): >>><<< 11389 1726854850.55703: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854850.55705: _low_level_execute_command(): starting 11389 1726854850.55708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136 `" && echo ansible-tmp-1726854850.5568333-11600-122686648695136="` echo /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136 `" ) && sleep 0' 11389 1726854850.56327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854850.56339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854850.56350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854850.56363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854850.56379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854850.56470: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854850.56478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.56490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854850.56493: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854850.56496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11389 1726854850.56498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854850.56500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854850.56502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854850.56504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854850.56506: stderr chunk (state=3): >>>debug2: match found <<< 11389 1726854850.56508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.56592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854850.56623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854850.56634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.56714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854850.58627: stdout chunk (state=3): >>>ansible-tmp-1726854850.5568333-11600-122686648695136=/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136 <<< 11389 1726854850.58777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854850.58809: stdout chunk (state=3): >>><<< 11389 1726854850.58821: stderr chunk (state=3): >>><<< 11389 1726854850.58993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854850.5568333-11600-122686648695136=/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854850.58996: variable 'ansible_module_compression' from source: unknown 11389 1726854850.58999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11389 1726854850.59001: variable 'ansible_facts' from source: unknown 11389 1726854850.59261: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py 11389 1726854850.59476: Sending initial data 11389 1726854850.59572: Sent initial data (154 bytes) 11389 1726854850.60506: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854850.60536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854850.60553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854850.60574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.60656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854850.62330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854850.62418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854850.62473: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpl_5rpomk /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py <<< 11389 1726854850.62483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py" <<< 11389 1726854850.62525: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpl_5rpomk" to remote "/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py" <<< 11389 1726854850.64395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854850.64399: stdout chunk (state=3): >>><<< 11389 1726854850.64401: stderr chunk (state=3): >>><<< 11389 1726854850.64403: done transferring module to remote 11389 1726854850.64405: _low_level_execute_command(): starting 11389 1726854850.64407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/ /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py && sleep 0' 11389 1726854850.64956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854850.64974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854850.64991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854850.65011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854850.65027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 
1726854850.65038: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854850.65139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854850.65163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.65244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854850.67086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854850.67107: stdout chunk (state=3): >>><<< 11389 1726854850.67123: stderr chunk (state=3): >>><<< 11389 1726854850.67230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854850.67233: _low_level_execute_command(): starting 11389 1726854850.67236: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/AnsiballZ_setup.py && sleep 0' 11389 1726854850.67777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854850.67792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854850.67805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854850.67831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854850.67924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' <<< 11389 1726854850.67953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854850.67968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854850.68075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.30529: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": 
"10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2998, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 533, "free": 2998}, "nocache": {"free": 3312, "used": 219}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": 
"NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 622, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805641728, "block_size": 4096, "block_total": 65519099, "block_available": 63917393, "block_used": 1601706, "inode_total": 131070960, "inode_available": 131029139, "inode_used": 41821, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": 
[3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "11", "epoch": "1726854851", "epoch_int": "1726854851", "date": "2024-09-20", "time": "13:54:11", "iso8601_micro": 
"2024-09-20T17:54:11.301608Z", "iso8601": "2024-09-20T17:54:11Z", "iso8601_basic": "20240920T135411301608", "iso8601_basic_short": "20240920T135411", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.490234375, "5m": 0.2216796875, "15m": 0.1181640625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11389 1726854851.32703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854851.32708: stdout chunk (state=3): >>><<< 11389 1726854851.32710: stderr chunk (state=3): >>><<< 11389 1726854851.32714: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-244", "ansible_nodename": "ip-10-31-9-244.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2bc2acdd478a7423346e83b59fcdca", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCehZcRIiuho2g2VlWk6lYiySbVbpZPmaChpZJeIDeRDUTa1RCEnhGqH+DqSOr9XQgt/gHETb6HW1jwsrG3TM2y4UJqdcp3Vzdn2ceWAQSdC2hYxEDR7vD44mLY2TejKxXaN9WKywAwIRXdqXE3GJHR51KQe4kLYkzvhwwLpGlQwdZ5Tr4DTu6gsb5lUwzcvzk7gErzO/v2T4+jlj/bt7UDFkiASBXxhi+oZQYQAxOwOgM1BAGpl8GWX5nd5MFlFvztq2uV8Mra3ANc/7CgBxQOT9iCGpBsUXJ9UG35hNjY0xC5qa1XCoQbp0sbNhS4C+uvHspFzAqFLBx69tc5dbYXanTxy+MCFe9g8WNJpNFK9UNYAWwDdUIfkDHf3HqZtqnMO8FBVbSS6+K2NOdt9ZrQP4d9jVZxS7o3E37g6YSmVV+6OJZ8oTiSVe1wx3uYYpFCPbdbdOGkXpvU0oaDInwYl5PzhM1yjiWMsSqSvYHkCUAzsAv0Ws/L0t5uXSgTbCU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAAcVe0oQCsdWka9CinqxODLfzoA5WUkIscuWGu+0Pb9loUC4MBgDClPe5T0oztCcT0NSKld23Y2UFOZyAkaU+U=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAICjU01xmt/yoMRnNQ5IgfXwC8CabJN267FXBGFtFm2PC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": 
{"address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1088:11ff:feda:7fa3", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.244", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:88:11:da:7f:a3", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.244"], "ansible_all_ipv6_addresses": ["fe80::1088:11ff:feda:7fa3"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.244", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1088:11ff:feda:7fa3"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2998, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 533, "free": 2998}, "nocache": {"free": 3312, "used": 219}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_uuid": "ec2bc2ac-dd47-8a74-2334-6e83b59fcdca", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 622, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805641728, "block_size": 4096, "block_total": 65519099, "block_available": 63917393, "block_used": 1601706, "inode_total": 131070960, "inode_available": 131029139, "inode_used": 41821, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 52416 10.31.9.244 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 52416 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "54", "second": "11", "epoch": "1726854851", "epoch_int": "1726854851", "date": "2024-09-20", "time": "13:54:11", "iso8601_micro": "2024-09-20T17:54:11.301608Z", "iso8601": "2024-09-20T17:54:11Z", "iso8601_basic": "20240920T135411301608", "iso8601_basic_short": "20240920T135411", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.490234375, "5m": 0.2216796875, "15m": 0.1181640625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854851.33007: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854851.33051: _low_level_execute_command(): starting 11389 1726854851.33064: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854850.5568333-11600-122686648695136/ > /dev/null 2>&1 && sleep 0' 11389 1726854851.33742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854851.33797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.33866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854851.33884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.33928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.34174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.35959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854851.35974: stdout chunk (state=3): >>><<< 11389 1726854851.36005: stderr chunk (state=3): >>><<< 11389 1726854851.36027: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854851.36041: handler run complete 11389 1726854851.36193: variable 'ansible_facts' from source: unknown 11389 1726854851.36302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.36650: variable 'ansible_facts' from source: unknown 11389 1726854851.36740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.36878: attempt loop complete, returning result 11389 1726854851.36886: _execute() done 11389 1726854851.36896: dumping result to json 11389 1726854851.36930: done dumping result, returning 11389 1726854851.36942: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0affcc66-ac2b-deb8-c119-000000000128] 11389 1726854851.36953: sending task result for task 0affcc66-ac2b-deb8-c119-000000000128 11389 1726854851.37563: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000128 11389 1726854851.37566: WORKER PROCESS EXITING ok: [managed_node3] 11389 1726854851.37833: no more pending results, returning what we have 11389 1726854851.37836: results queue empty 11389 1726854851.37837: checking for any_errors_fatal 11389 1726854851.37838: done checking for any_errors_fatal 11389 1726854851.37838: checking for max_fail_percentage 11389 1726854851.37840: done checking for max_fail_percentage 11389 1726854851.37841: checking to see if all hosts have failed and the running result is not ok 11389 1726854851.37842: done checking to see if all hosts have failed 11389 1726854851.37842: getting the remaining hosts for this loop 11389 
1726854851.37844: done getting the remaining hosts for this loop 11389 1726854851.37847: getting the next task for host managed_node3 11389 1726854851.37852: done getting next task for host managed_node3 11389 1726854851.37853: ^ task is: TASK: meta (flush_handlers) 11389 1726854851.37856: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854851.37860: getting variables 11389 1726854851.37861: in VariableManager get_vars() 11389 1726854851.37993: Calling all_inventory to load vars for managed_node3 11389 1726854851.37996: Calling groups_inventory to load vars for managed_node3 11389 1726854851.37998: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854851.38018: Calling all_plugins_play to load vars for managed_node3 11389 1726854851.38021: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854851.38025: Calling groups_plugins_play to load vars for managed_node3 11389 1726854851.38393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.38590: done with get_vars() 11389 1726854851.38609: done getting variables 11389 1726854851.38686: in VariableManager get_vars() 11389 1726854851.38704: Calling all_inventory to load vars for managed_node3 11389 1726854851.38707: Calling groups_inventory to load vars for managed_node3 11389 1726854851.38709: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854851.38713: Calling all_plugins_play to load vars for managed_node3 11389 1726854851.38715: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854851.38718: Calling groups_plugins_play to load vars for managed_node3 11389 1726854851.38862: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.39057: done with get_vars() 11389 1726854851.39071: done queuing things up, now waiting for results queue to drain 11389 1726854851.39073: results queue empty 11389 1726854851.39073: checking for any_errors_fatal 11389 1726854851.39076: done checking for any_errors_fatal 11389 1726854851.39077: checking for max_fail_percentage 11389 1726854851.39078: done checking for max_fail_percentage 11389 1726854851.39078: checking to see if all hosts have failed and the running result is not ok 11389 1726854851.39083: done checking to see if all hosts have failed 11389 1726854851.39083: getting the remaining hosts for this loop 11389 1726854851.39084: done getting the remaining hosts for this loop 11389 1726854851.39088: getting the next task for host managed_node3 11389 1726854851.39092: done getting next task for host managed_node3 11389 1726854851.39094: ^ task is: TASK: INIT Prepare setup 11389 1726854851.39095: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854851.39103: getting variables 11389 1726854851.39104: in VariableManager get_vars() 11389 1726854851.39116: Calling all_inventory to load vars for managed_node3 11389 1726854851.39118: Calling groups_inventory to load vars for managed_node3 11389 1726854851.39120: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854851.39124: Calling all_plugins_play to load vars for managed_node3 11389 1726854851.39126: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854851.39128: Calling groups_plugins_play to load vars for managed_node3 11389 1726854851.39273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.39510: done with get_vars() 11389 1726854851.39518: done getting variables 11389 1726854851.39605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 13:54:11 -0400 (0:00:00.901) 0:00:03.819 ****** 11389 1726854851.39632: entering _queue_task() for managed_node3/debug 11389 1726854851.39633: Creating lock for debug 11389 1726854851.39965: worker is 1 (out of 1 available) 11389 1726854851.39993: exiting _queue_task() for managed_node3/debug 11389 1726854851.40005: done queuing things up, now waiting for results queue to drain 11389 1726854851.40007: waiting for pending results... 
11389 1726854851.40514: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 11389 1726854851.40519: in run() - task 0affcc66-ac2b-deb8-c119-00000000000b 11389 1726854851.40527: variable 'ansible_search_path' from source: unknown 11389 1726854851.40530: calling self._execute() 11389 1726854851.40533: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.40535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854851.40539: variable 'omit' from source: magic vars 11389 1726854851.40873: variable 'ansible_distribution_major_version' from source: facts 11389 1726854851.40939: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854851.40942: variable 'omit' from source: magic vars 11389 1726854851.40945: variable 'omit' from source: magic vars 11389 1726854851.40973: variable 'omit' from source: magic vars 11389 1726854851.41019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854851.41065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854851.41094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854851.41119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854851.41156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854851.41176: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854851.41189: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.41265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854851.41311: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 
1726854851.41325: Set connection var ansible_timeout to 10 11389 1726854851.41332: Set connection var ansible_connection to ssh 11389 1726854851.41342: Set connection var ansible_shell_type to sh 11389 1726854851.41352: Set connection var ansible_pipelining to False 11389 1726854851.41361: Set connection var ansible_shell_executable to /bin/sh 11389 1726854851.41393: variable 'ansible_shell_executable' from source: unknown 11389 1726854851.41403: variable 'ansible_connection' from source: unknown 11389 1726854851.41412: variable 'ansible_module_compression' from source: unknown 11389 1726854851.41419: variable 'ansible_shell_type' from source: unknown 11389 1726854851.41426: variable 'ansible_shell_executable' from source: unknown 11389 1726854851.41433: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.41442: variable 'ansible_pipelining' from source: unknown 11389 1726854851.41449: variable 'ansible_timeout' from source: unknown 11389 1726854851.41483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854851.41597: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854851.41611: variable 'omit' from source: magic vars 11389 1726854851.41621: starting attempt loop 11389 1726854851.41626: running the handler 11389 1726854851.41700: handler run complete 11389 1726854851.41703: attempt loop complete, returning result 11389 1726854851.41706: _execute() done 11389 1726854851.41707: dumping result to json 11389 1726854851.41712: done dumping result, returning 11389 1726854851.41723: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [0affcc66-ac2b-deb8-c119-00000000000b] 11389 1726854851.41732: sending task result for task 
0affcc66-ac2b-deb8-c119-00000000000b 11389 1726854851.41909: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000000b 11389 1726854851.41917: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 11389 1726854851.41963: no more pending results, returning what we have 11389 1726854851.41969: results queue empty 11389 1726854851.41970: checking for any_errors_fatal 11389 1726854851.41972: done checking for any_errors_fatal 11389 1726854851.41972: checking for max_fail_percentage 11389 1726854851.41974: done checking for max_fail_percentage 11389 1726854851.41975: checking to see if all hosts have failed and the running result is not ok 11389 1726854851.41976: done checking to see if all hosts have failed 11389 1726854851.41977: getting the remaining hosts for this loop 11389 1726854851.41978: done getting the remaining hosts for this loop 11389 1726854851.41982: getting the next task for host managed_node3 11389 1726854851.41990: done getting next task for host managed_node3 11389 1726854851.41993: ^ task is: TASK: Install dnsmasq 11389 1726854851.41996: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854851.42000: getting variables 11389 1726854851.42001: in VariableManager get_vars() 11389 1726854851.42039: Calling all_inventory to load vars for managed_node3 11389 1726854851.42042: Calling groups_inventory to load vars for managed_node3 11389 1726854851.42044: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854851.42053: Calling all_plugins_play to load vars for managed_node3 11389 1726854851.42055: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854851.42057: Calling groups_plugins_play to load vars for managed_node3 11389 1726854851.42264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854851.42450: done with get_vars() 11389 1726854851.42466: done getting variables 11389 1726854851.42525: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:54:11 -0400 (0:00:00.029) 0:00:03.848 ****** 11389 1726854851.42555: entering _queue_task() for managed_node3/package 11389 1726854851.42825: worker is 1 (out of 1 available) 11389 1726854851.42837: exiting _queue_task() for managed_node3/package 11389 1726854851.42847: done queuing things up, now waiting for results queue to drain 11389 1726854851.42848: waiting for pending results... 
11389 1726854851.43049: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 11389 1726854851.43202: in run() - task 0affcc66-ac2b-deb8-c119-00000000000f 11389 1726854851.43206: variable 'ansible_search_path' from source: unknown 11389 1726854851.43208: variable 'ansible_search_path' from source: unknown 11389 1726854851.43245: calling self._execute() 11389 1726854851.43370: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.43440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854851.43444: variable 'omit' from source: magic vars 11389 1726854851.43840: variable 'ansible_distribution_major_version' from source: facts 11389 1726854851.43936: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854851.43946: variable 'omit' from source: magic vars 11389 1726854851.44216: variable 'omit' from source: magic vars 11389 1726854851.44504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854851.46739: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854851.46885: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854851.46892: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854851.46914: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854851.46946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854851.47051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854851.47083: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854851.47118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854851.47158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854851.47173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854851.47281: variable '__network_is_ostree' from source: set_fact 11389 1726854851.47318: variable 'omit' from source: magic vars 11389 1726854851.47332: variable 'omit' from source: magic vars 11389 1726854851.47363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854851.47394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854851.47427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854851.47536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854851.47539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854851.47542: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854851.47544: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.47546: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11389 1726854851.47596: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854851.47608: Set connection var ansible_timeout to 10 11389 1726854851.47613: Set connection var ansible_connection to ssh 11389 1726854851.47622: Set connection var ansible_shell_type to sh 11389 1726854851.47631: Set connection var ansible_pipelining to False 11389 1726854851.47646: Set connection var ansible_shell_executable to /bin/sh 11389 1726854851.47669: variable 'ansible_shell_executable' from source: unknown 11389 1726854851.47675: variable 'ansible_connection' from source: unknown 11389 1726854851.47680: variable 'ansible_module_compression' from source: unknown 11389 1726854851.47685: variable 'ansible_shell_type' from source: unknown 11389 1726854851.47692: variable 'ansible_shell_executable' from source: unknown 11389 1726854851.47697: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854851.47703: variable 'ansible_pipelining' from source: unknown 11389 1726854851.47707: variable 'ansible_timeout' from source: unknown 11389 1726854851.47713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854851.47816: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854851.47832: variable 'omit' from source: magic vars 11389 1726854851.47841: starting attempt loop 11389 1726854851.47847: running the handler 11389 1726854851.47866: variable 'ansible_facts' from source: unknown 11389 1726854851.47874: variable 'ansible_facts' from source: unknown 11389 1726854851.47913: _low_level_execute_command(): starting 11389 1726854851.47972: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 
1726854851.48636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854851.48651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854851.48708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.48778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854851.48805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.48845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.48921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.50627: stdout chunk (state=3): >>>/root <<< 11389 1726854851.50767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854851.50799: stderr chunk (state=3): >>><<< 11389 1726854851.50802: stdout chunk (state=3): >>><<< 11389 1726854851.50823: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854851.50943: _low_level_execute_command(): starting 11389 1726854851.50947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530 `" && echo ansible-tmp-1726854851.5084205-11641-182417000487530="` echo /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530 `" ) && sleep 0' 11389 1726854851.51528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854851.51556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854851.51654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854851.51672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.51701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854851.51716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.51741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.51902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.53846: stdout chunk (state=3): >>>ansible-tmp-1726854851.5084205-11641-182417000487530=/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530 <<< 11389 1726854851.54053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854851.54057: stdout chunk (state=3): >>><<< 11389 1726854851.54059: stderr chunk (state=3): >>><<< 11389 1726854851.54254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854851.5084205-11641-182417000487530=/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854851.54259: variable 'ansible_module_compression' from source: unknown 11389 1726854851.54373: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11389 1726854851.54389: ANSIBALLZ: Acquiring lock 11389 1726854851.54476: ANSIBALLZ: Lock acquired: 140464425326096 11389 1726854851.54480: ANSIBALLZ: Creating module 11389 1726854851.76415: ANSIBALLZ: Writing module into payload 11389 1726854851.76700: ANSIBALLZ: Writing module 11389 1726854851.76722: ANSIBALLZ: Renaming module 11389 1726854851.76727: ANSIBALLZ: Done creating module 11389 1726854851.76789: variable 'ansible_facts' from source: unknown 11389 1726854851.77054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py 11389 1726854851.77594: Sending initial data 11389 1726854851.77597: Sent initial data (152 bytes) 11389 1726854851.78767: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854851.78789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854851.78806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 
1726854851.78855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854851.78891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854851.78912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854851.78964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.79031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854851.79081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.79112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.79209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.80934: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854851.81170: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854851.81230: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp1chw7wfk /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py <<< 11389 1726854851.81248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py" <<< 11389 1726854851.81293: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp1chw7wfk" to remote "/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py" <<< 11389 1726854851.83149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854851.83153: stderr chunk (state=3): >>><<< 11389 1726854851.83156: stdout chunk (state=3): >>><<< 11389 1726854851.83261: done transferring module to remote 11389 1726854851.83264: _low_level_execute_command(): starting 11389 1726854851.83266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/ /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py && sleep 0' 11389 1726854851.84085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854851.84104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.84115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854851.84169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854851.84183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.84543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.84584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854851.86393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854851.86398: stdout chunk (state=3): >>><<< 11389 1726854851.86403: stderr chunk (state=3): >>><<< 11389 1726854851.86435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854851.86501: _low_level_execute_command(): starting 11389 1726854851.86507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/AnsiballZ_dnf.py && sleep 0' 11389 1726854851.88201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' 
debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854851.88536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854851.88631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.29526: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11389 1726854852.33760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854852.33765: stdout chunk (state=3): >>><<< 11389 1726854852.33767: stderr chunk (state=3): >>><<< 11389 1726854852.33770: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854852.33777: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854852.33779: _low_level_execute_command(): starting 11389 1726854852.33782: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854851.5084205-11641-182417000487530/ > /dev/null 2>&1 && sleep 0' 11389 1726854852.34598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854852.34611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854852.34617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854852.34631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854852.34703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854852.34733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854852.34757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.34770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.34861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.36994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854852.36999: stdout chunk (state=3): >>><<< 11389 1726854852.37006: stderr chunk (state=3): >>><<< 11389 1726854852.37009: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854852.37011: handler run complete 11389 1726854852.37238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854852.37556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854852.37846: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854852.37849: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854852.37852: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854852.38019: variable '__install_status' from source: unknown 11389 1726854852.38044: Evaluated conditional (__install_status is success): True 11389 1726854852.38298: attempt loop complete, returning result 11389 1726854852.38301: _execute() done 11389 1726854852.38304: dumping result to json 11389 1726854852.38306: done dumping result, returning 11389 1726854852.38308: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0affcc66-ac2b-deb8-c119-00000000000f] 11389 1726854852.38310: sending task result for task 0affcc66-ac2b-deb8-c119-00000000000f 11389 1726854852.38375: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000000f 11389 1726854852.38377: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11389 1726854852.38481: no more pending results, returning what we have 11389 1726854852.38484: results queue empty 11389 1726854852.38484: checking for any_errors_fatal 
11389 1726854852.38492: done checking for any_errors_fatal 11389 1726854852.38493: checking for max_fail_percentage 11389 1726854852.38495: done checking for max_fail_percentage 11389 1726854852.38496: checking to see if all hosts have failed and the running result is not ok 11389 1726854852.38497: done checking to see if all hosts have failed 11389 1726854852.38497: getting the remaining hosts for this loop 11389 1726854852.38499: done getting the remaining hosts for this loop 11389 1726854852.38502: getting the next task for host managed_node3 11389 1726854852.38509: done getting next task for host managed_node3 11389 1726854852.38511: ^ task is: TASK: Install pgrep, sysctl 11389 1726854852.38513: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854852.38517: getting variables 11389 1726854852.38518: in VariableManager get_vars() 11389 1726854852.38555: Calling all_inventory to load vars for managed_node3 11389 1726854852.38557: Calling groups_inventory to load vars for managed_node3 11389 1726854852.38559: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854852.38572: Calling all_plugins_play to load vars for managed_node3 11389 1726854852.38574: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854852.38576: Calling groups_plugins_play to load vars for managed_node3 11389 1726854852.38999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854852.39198: done with get_vars() 11389 1726854852.39209: done getting variables 11389 1726854852.39273: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:54:12 -0400 (0:00:00.967) 0:00:04.816 ****** 11389 1726854852.39311: entering _queue_task() for managed_node3/package 11389 1726854852.39600: worker is 1 (out of 1 available) 11389 1726854852.39614: exiting _queue_task() for managed_node3/package 11389 1726854852.39626: done queuing things up, now waiting for results queue to drain 11389 1726854852.39628: waiting for pending results... 
11389 1726854852.39878: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11389 1726854852.40006: in run() - task 0affcc66-ac2b-deb8-c119-000000000010 11389 1726854852.40025: variable 'ansible_search_path' from source: unknown 11389 1726854852.40033: variable 'ansible_search_path' from source: unknown 11389 1726854852.40076: calling self._execute() 11389 1726854852.40171: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854852.40201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854852.40221: variable 'omit' from source: magic vars 11389 1726854852.40927: variable 'ansible_distribution_major_version' from source: facts 11389 1726854852.40944: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854852.41179: variable 'ansible_os_family' from source: facts 11389 1726854852.41456: Evaluated conditional (ansible_os_family == 'RedHat'): True 11389 1726854852.41556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854852.42045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854852.42102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854852.42184: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854852.42231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854852.42321: variable 'ansible_distribution_major_version' from source: facts 11389 1726854852.42345: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11389 1726854852.42353: when evaluation is False, skipping this task 11389 1726854852.42360: _execute() done 11389 1726854852.42366: dumping result to json 11389 1726854852.42379: done dumping result, 
returning 11389 1726854852.42397: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcc66-ac2b-deb8-c119-000000000010] 11389 1726854852.42412: sending task result for task 0affcc66-ac2b-deb8-c119-000000000010 11389 1726854852.42694: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000010 11389 1726854852.42701: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11389 1726854852.42751: no more pending results, returning what we have 11389 1726854852.42755: results queue empty 11389 1726854852.42756: checking for any_errors_fatal 11389 1726854852.42762: done checking for any_errors_fatal 11389 1726854852.42763: checking for max_fail_percentage 11389 1726854852.42765: done checking for max_fail_percentage 11389 1726854852.42766: checking to see if all hosts have failed and the running result is not ok 11389 1726854852.42770: done checking to see if all hosts have failed 11389 1726854852.42771: getting the remaining hosts for this loop 11389 1726854852.42773: done getting the remaining hosts for this loop 11389 1726854852.42776: getting the next task for host managed_node3 11389 1726854852.42783: done getting next task for host managed_node3 11389 1726854852.42786: ^ task is: TASK: Install pgrep, sysctl 11389 1726854852.42791: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11389 1726854852.42795: getting variables 11389 1726854852.42797: in VariableManager get_vars() 11389 1726854852.42839: Calling all_inventory to load vars for managed_node3 11389 1726854852.42842: Calling groups_inventory to load vars for managed_node3 11389 1726854852.42845: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854852.42857: Calling all_plugins_play to load vars for managed_node3 11389 1726854852.42860: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854852.42863: Calling groups_plugins_play to load vars for managed_node3 11389 1726854852.43173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854852.43370: done with get_vars() 11389 1726854852.43383: done getting variables 11389 1726854852.43443: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:54:12 -0400 (0:00:00.041) 0:00:04.857 ****** 11389 1726854852.43475: entering _queue_task() for managed_node3/package 11389 1726854852.43759: worker is 1 (out of 1 available) 11389 1726854852.43775: exiting _queue_task() for managed_node3/package 11389 1726854852.43788: done queuing things up, now waiting for results queue to drain 11389 1726854852.43790: waiting for pending results... 
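The second variant (yml:26) is the one that actually runs on this host. The dnf invocation recorded later in the log shows `"name": ["procps-ng"], "state": "present"` in its module_args, and the conditional `ansible_distribution_major_version is version('7', '>=')` evaluates True, so this variant can be reconstructed with more confidence, though the exact YAML remains an assumption:

```yaml
# Reconstruction of the task that runs (yml:26). name/state match the
# module_args later logged by _execute_module; the when-guard matches
# the conditional evaluated for this task. The surrounding YAML layout
# is assumed.
- name: Install pgrep, sysctl
  package:
    name: procps-ng
    state: present
  when: ansible_distribution_major_version is version('7', '>=')
```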
11389 1726854852.44005: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11389 1726854852.44059: in run() - task 0affcc66-ac2b-deb8-c119-000000000011 11389 1726854852.44073: variable 'ansible_search_path' from source: unknown 11389 1726854852.44077: variable 'ansible_search_path' from source: unknown 11389 1726854852.44104: calling self._execute() 11389 1726854852.44292: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854852.44297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854852.44300: variable 'omit' from source: magic vars 11389 1726854852.44594: variable 'ansible_distribution_major_version' from source: facts 11389 1726854852.44610: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854852.44725: variable 'ansible_os_family' from source: facts 11389 1726854852.44738: Evaluated conditional (ansible_os_family == 'RedHat'): True 11389 1726854852.44890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854852.45135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854852.45182: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854852.45217: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854852.45278: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854852.45324: variable 'ansible_distribution_major_version' from source: facts 11389 1726854852.45339: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11389 1726854852.45348: variable 'omit' from source: magic vars 11389 1726854852.45397: variable 'omit' from source: magic vars 11389 1726854852.45535: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854852.47124: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854852.47171: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854852.47196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854852.47492: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854852.47495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854852.47532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854852.47560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854852.47593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854852.47638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854852.47655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854852.47759: variable '__network_is_ostree' from source: set_fact 11389 1726854852.47769: 
variable 'omit' from source: magic vars 11389 1726854852.47808: variable 'omit' from source: magic vars 11389 1726854852.47839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854852.47862: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854852.47879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854852.47893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854852.47903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854852.47925: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854852.47928: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854852.47931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854852.48004: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854852.48010: Set connection var ansible_timeout to 10 11389 1726854852.48013: Set connection var ansible_connection to ssh 11389 1726854852.48017: Set connection var ansible_shell_type to sh 11389 1726854852.48022: Set connection var ansible_pipelining to False 11389 1726854852.48027: Set connection var ansible_shell_executable to /bin/sh 11389 1726854852.48048: variable 'ansible_shell_executable' from source: unknown 11389 1726854852.48051: variable 'ansible_connection' from source: unknown 11389 1726854852.48053: variable 'ansible_module_compression' from source: unknown 11389 1726854852.48055: variable 'ansible_shell_type' from source: unknown 11389 1726854852.48057: variable 'ansible_shell_executable' from source: unknown 11389 1726854852.48060: variable 'ansible_host' from source: host vars for 'managed_node3' 
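The "Set connection var" entries above correspond to Ansible's standard behavioral inventory parameters. A hedged host_vars sketch that would reproduce the logged values follows; note the log does not reveal which of these were explicitly set in inventory versus filled in from defaults, so treat this purely as an illustration of the mapping:

```yaml
# host_vars/managed_node3.yml -- hypothetical; mirrors the connection
# values logged above (ansible_module_compression defaults to
# ZIP_DEFLATED and is omitted here).
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
```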
11389 1726854852.48064: variable 'ansible_pipelining' from source: unknown 11389 1726854852.48066: variable 'ansible_timeout' from source: unknown 11389 1726854852.48073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854852.48141: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854852.48153: variable 'omit' from source: magic vars 11389 1726854852.48159: starting attempt loop 11389 1726854852.48163: running the handler 11389 1726854852.48165: variable 'ansible_facts' from source: unknown 11389 1726854852.48167: variable 'ansible_facts' from source: unknown 11389 1726854852.48196: _low_level_execute_command(): starting 11389 1726854852.48201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854852.48698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854852.48703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854852.48705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854852.48761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854852.48764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.48766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.48842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.50560: stdout chunk (state=3): >>>/root <<< 11389 1726854852.50656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854852.50694: stderr chunk (state=3): >>><<< 11389 1726854852.50697: stdout chunk (state=3): >>><<< 11389 1726854852.50714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11389 1726854852.50724: _low_level_execute_command(): starting 11389 1726854852.50730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329 `" && echo ansible-tmp-1726854852.507146-11724-31265337721329="` echo /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329 `" ) && sleep 0' 11389 1726854852.51191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854852.51195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854852.51198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854852.51201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854852.51254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854852.51257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.51261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.51322: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.53199: stdout chunk (state=3): >>>ansible-tmp-1726854852.507146-11724-31265337721329=/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329 <<< 11389 1726854852.53304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854852.53331: stderr chunk (state=3): >>><<< 11389 1726854852.53334: stdout chunk (state=3): >>><<< 11389 1726854852.53352: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854852.507146-11724-31265337721329=/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854852.53379: variable 'ansible_module_compression' from source: unknown 11389 1726854852.53425: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11389 1726854852.53462: variable 'ansible_facts' from source: unknown 11389 1726854852.53541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py 11389 1726854852.53642: Sending initial data 11389 1726854852.53645: Sent initial data (150 bytes) 11389 1726854852.54217: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.54239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.54321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.55878: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854852.55939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854852.55998: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp6az2aesc /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py <<< 11389 1726854852.56001: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py" <<< 11389 1726854852.56053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp6az2aesc" to remote "/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py" <<< 11389 1726854852.56057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py" <<< 11389 1726854852.57094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854852.57098: stdout chunk (state=3): >>><<< 11389 1726854852.57101: stderr chunk (state=3): >>><<< 11389 1726854852.57103: done transferring module to remote 11389 1726854852.57105: _low_level_execute_command(): starting 11389 1726854852.57108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/ 
/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py && sleep 0' 11389 1726854852.57800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854852.57814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854852.57854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854852.57872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854852.57959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.57992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.58081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854852.59859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854852.59923: stdout chunk (state=3): >>><<< 11389 1726854852.59927: stderr chunk (state=3): >>><<< 11389 1726854852.59949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854852.60004: _low_level_execute_command(): starting 11389 1726854852.60008: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/AnsiballZ_dnf.py && sleep 0' 11389 1726854852.60649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854852.60675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854852.60786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854852.60806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854852.60829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854852.60921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.01515: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11389 1726854853.06875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854853.06881: stdout chunk (state=3): >>><<< 11389 1726854853.06883: stderr chunk (state=3): >>><<< 11389 1726854853.07094: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854853.07105: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854853.07200: _low_level_execute_command(): starting 11389 1726854853.07207: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854852.507146-11724-31265337721329/ > /dev/null 2>&1 && sleep 0' 11389 1726854853.08429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854853.08434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.08780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854853.08831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.09018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.10939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854853.10943: stdout chunk (state=3): >>><<< 11389 1726854853.10946: stderr chunk (state=3): >>><<< 11389 1726854853.10964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854853.10980: handler run complete 11389 1726854853.11024: attempt loop complete, returning result 11389 1726854853.11293: _execute() done 11389 1726854853.11296: dumping result to json 11389 1726854853.11299: done dumping result, returning 11389 1726854853.11301: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0affcc66-ac2b-deb8-c119-000000000011] 11389 1726854853.11303: sending task result for task 0affcc66-ac2b-deb8-c119-000000000011 ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11389 1726854853.11454: no more pending results, returning what we have 11389 1726854853.11458: results queue empty 11389 1726854853.11459: checking for any_errors_fatal 11389 1726854853.11465: done checking for any_errors_fatal 11389 1726854853.11466: checking for max_fail_percentage 11389 1726854853.11470: done checking for max_fail_percentage 11389 1726854853.11471: checking to see if all hosts have failed and the running result is not ok 11389 1726854853.11472: done checking to see if all hosts have failed 11389 1726854853.11473: getting the remaining hosts for this loop 11389 1726854853.11474: done getting the remaining hosts for this loop 11389 1726854853.11478: getting the next task for host managed_node3 11389 1726854853.11484: done getting next task for host managed_node3 11389 1726854853.11489: ^ task is: TASK: Create test interfaces 11389 1726854853.11492: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854853.11495: getting variables 11389 1726854853.11497: in VariableManager get_vars() 11389 1726854853.11537: Calling all_inventory to load vars for managed_node3 11389 1726854853.11541: Calling groups_inventory to load vars for managed_node3 11389 1726854853.11544: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854853.11555: Calling all_plugins_play to load vars for managed_node3 11389 1726854853.11558: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854853.11562: Calling groups_plugins_play to load vars for managed_node3 11389 1726854853.12395: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000011 11389 1726854853.12399: WORKER PROCESS EXITING 11389 1726854853.12500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854853.12708: done with get_vars() 11389 1726854853.12720: done getting variables 11389 1726854853.13014: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 13:54:13 -0400 (0:00:00.695) 0:00:05.553 ****** 11389 1726854853.13046: entering _queue_task() for managed_node3/shell 11389 1726854853.13048: Creating lock for shell 11389 1726854853.14018: worker is 1 (out of 1 available) 11389 1726854853.14029: exiting _queue_task() for managed_node3/shell 11389 1726854853.14038: done queuing 
things up, now waiting for results queue to drain 11389 1726854853.14039: waiting for pending results... 11389 1726854853.14219: running TaskExecutor() for managed_node3/TASK: Create test interfaces 11389 1726854853.14462: in run() - task 0affcc66-ac2b-deb8-c119-000000000012 11389 1726854853.14488: variable 'ansible_search_path' from source: unknown 11389 1726854853.14678: variable 'ansible_search_path' from source: unknown 11389 1726854853.14681: calling self._execute() 11389 1726854853.14806: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854853.14820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854853.14836: variable 'omit' from source: magic vars 11389 1726854853.15622: variable 'ansible_distribution_major_version' from source: facts 11389 1726854853.15811: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854853.15815: variable 'omit' from source: magic vars 11389 1726854853.15818: variable 'omit' from source: magic vars 11389 1726854853.16620: variable 'dhcp_interface1' from source: play vars 11389 1726854853.16694: variable 'dhcp_interface2' from source: play vars 11389 1726854853.16732: variable 'omit' from source: magic vars 11389 1726854853.16834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854853.16876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854853.17093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854853.17096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854853.17099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854853.17101: variable 'inventory_hostname' from 
source: host vars for 'managed_node3' 11389 1726854853.17103: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854853.17105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854853.17327: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854853.17445: Set connection var ansible_timeout to 10 11389 1726854853.17448: Set connection var ansible_connection to ssh 11389 1726854853.17451: Set connection var ansible_shell_type to sh 11389 1726854853.17453: Set connection var ansible_pipelining to False 11389 1726854853.17456: Set connection var ansible_shell_executable to /bin/sh 11389 1726854853.17892: variable 'ansible_shell_executable' from source: unknown 11389 1726854853.17895: variable 'ansible_connection' from source: unknown 11389 1726854853.17898: variable 'ansible_module_compression' from source: unknown 11389 1726854853.17900: variable 'ansible_shell_type' from source: unknown 11389 1726854853.17902: variable 'ansible_shell_executable' from source: unknown 11389 1726854853.17904: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854853.17905: variable 'ansible_pipelining' from source: unknown 11389 1726854853.17907: variable 'ansible_timeout' from source: unknown 11389 1726854853.17910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854853.17914: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854853.17917: variable 'omit' from source: magic vars 11389 1726854853.17919: starting attempt loop 11389 1726854853.17921: running the handler 11389 1726854853.17923: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854853.17943: _low_level_execute_command(): starting 11389 1726854853.17956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854853.19458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854853.19606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.19689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.19808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854853.19834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.19927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.21628: stdout chunk (state=3): >>>/root <<< 11389 1726854853.21755: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11389 1726854853.21800: stderr chunk (state=3): >>><<< 11389 1726854853.21863: stdout chunk (state=3): >>><<< 11389 1726854853.21899: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854853.21918: _low_level_execute_command(): starting 11389 1726854853.21976: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048 `" && echo ansible-tmp-1726854853.2190576-11777-221522900283048="` echo /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048 `" ) && sleep 0' 11389 1726854853.23194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854853.23264: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854853.23292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854853.23314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854853.23529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854853.23553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.23660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.25731: stdout chunk (state=3): >>>ansible-tmp-1726854853.2190576-11777-221522900283048=/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048 <<< 11389 1726854853.25805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854853.25871: stderr chunk (state=3): >>><<< 11389 1726854853.25882: stdout chunk (state=3): >>><<< 11389 1726854853.26079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854853.2190576-11777-221522900283048=/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854853.26083: variable 'ansible_module_compression' from source: unknown 11389 1726854853.26200: ANSIBALLZ: Using generic lock for ansible.legacy.command 11389 1726854853.26208: ANSIBALLZ: Acquiring lock 11389 1726854853.26215: ANSIBALLZ: Lock acquired: 140464425326096 11389 1726854853.26223: ANSIBALLZ: Creating module 11389 1726854853.42554: ANSIBALLZ: Writing module into payload 11389 1726854853.42661: ANSIBALLZ: Writing module 11389 1726854853.42700: ANSIBALLZ: Renaming module 11389 1726854853.42718: ANSIBALLZ: Done creating module 11389 1726854853.42742: variable 'ansible_facts' from source: unknown 11389 1726854853.42829: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py 11389 1726854853.43018: Sending initial data 11389 1726854853.43026: Sent initial data (156 bytes) 11389 
1726854853.44202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854853.44496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.44502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.46162: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854853.46222: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854853.46307: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp52rp84x9 /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py <<< 11389 1726854853.46311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py" <<< 11389 1726854853.46396: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp52rp84x9" to remote "/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py" <<< 11389 1726854853.47749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854853.47753: stderr chunk (state=3): >>><<< 11389 1726854853.47755: stdout chunk (state=3): >>><<< 11389 1726854853.47901: done transferring module to remote 11389 1726854853.47913: _low_level_execute_command(): starting 11389 1726854853.47918: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/ /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py && sleep 0' 11389 1726854853.49155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854853.49159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 <<< 11389 1726854853.49173: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854853.49177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.49406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854853.49410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.49413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854853.49415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854853.49601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.49633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854853.51461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854853.51523: stderr chunk (state=3): >>><<< 11389 1726854853.51527: stdout chunk (state=3): >>><<< 11389 1726854853.51546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854853.51549: _low_level_execute_command(): starting 11389 1726854853.51558: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/AnsiballZ_command.py && sleep 0' 11389 1726854853.52907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.52912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854853.52941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854853.52967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854853.52973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854853.53179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854853.53182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854853.53207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854853.53301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854854.89991: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 707 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 707 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link 
add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
<<< 11389 1726854854.90013: stdout chunk (state=3): >>>firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:54:13.682699", "end": "2024-09-20 13:54:14.898652", "delta": "0:00:01.215953", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854854.91557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854854.91581: stdout chunk (state=3): >>><<< 11389 1726854854.91604: stderr chunk (state=3): >>><<< 11389 1726854854.91643: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 707 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 707 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 
--dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:54:13.682699", "end": "2024-09-20 13:54:14.898652", "delta": "0:00:01.215953", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854854.91705: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854854.91712: _low_level_execute_command(): starting 11389 1726854854.91717: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854853.2190576-11777-221522900283048/ > /dev/null 2>&1 && sleep 0' 11389 1726854854.92169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854854.92173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854854.92175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854854.92177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854854.92180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854854.92232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854854.92239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854854.92298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854854.94176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854854.94180: stdout chunk (state=3): >>><<< 11389 1726854854.94393: stderr chunk (state=3): >>><<< 11389 1726854854.94396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854854.94399: handler run complete 11389 1726854854.94401: Evaluated conditional (False): False 11389 1726854854.94402: attempt loop complete, returning result 11389 1726854854.94404: _execute() done 11389 1726854854.94405: dumping result to json 11389 1726854854.94407: done dumping result, returning 11389 1726854854.94409: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0affcc66-ac2b-deb8-c119-000000000012] 11389 1726854854.94410: sending task result for task 0affcc66-ac2b-deb8-c119-000000000012 11389 1726854854.94473: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000012 11389 1726854854.94476: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.215953", "end": "2024-09-20 13:54:14.898652", "rc": 0, "start": "2024-09-20 13:54:13.682699" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 707 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 707 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11389 1726854854.94586: no more pending results, returning what we have 11389 1726854854.94595: results queue empty 11389 1726854854.94596: checking for any_errors_fatal 11389 1726854854.94603: done checking for any_errors_fatal 11389 1726854854.94604: checking for max_fail_percentage 11389 1726854854.94692: done checking for max_fail_percentage 11389 1726854854.94694: checking to see if all hosts have failed and 
the running result is not ok 11389 1726854854.94695: done checking to see if all hosts have failed 11389 1726854854.94696: getting the remaining hosts for this loop 11389 1726854854.94699: done getting the remaining hosts for this loop 11389 1726854854.94702: getting the next task for host managed_node3 11389 1726854854.94711: done getting next task for host managed_node3 11389 1726854854.94716: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11389 1726854854.94719: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854854.94723: getting variables 11389 1726854854.94724: in VariableManager get_vars() 11389 1726854854.94766: Calling all_inventory to load vars for managed_node3 11389 1726854854.94772: Calling groups_inventory to load vars for managed_node3 11389 1726854854.94775: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854854.94906: Calling all_plugins_play to load vars for managed_node3 11389 1726854854.94911: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854854.94915: Calling groups_plugins_play to load vars for managed_node3 11389 1726854854.95152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854854.95518: done with get_vars() 11389 1726854854.95527: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:54:14 -0400 (0:00:01.825) 0:00:07.379 ****** 11389 1726854854.95620: entering _queue_task() for managed_node3/include_tasks 11389 1726854854.95846: worker is 1 (out of 1 available) 11389 1726854854.95860: exiting _queue_task() for managed_node3/include_tasks 11389 1726854854.95871: done queuing things up, now waiting for results queue to drain 11389 1726854854.95873: waiting for pending results... 
11389 1726854854.96030: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11389 1726854854.96105: in run() - task 0affcc66-ac2b-deb8-c119-000000000016 11389 1726854854.96116: variable 'ansible_search_path' from source: unknown 11389 1726854854.96120: variable 'ansible_search_path' from source: unknown 11389 1726854854.96145: calling self._execute() 11389 1726854854.96209: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854854.96212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854854.96224: variable 'omit' from source: magic vars 11389 1726854854.96483: variable 'ansible_distribution_major_version' from source: facts 11389 1726854854.96494: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854854.96499: _execute() done 11389 1726854854.96502: dumping result to json 11389 1726854854.96505: done dumping result, returning 11389 1726854854.96512: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-deb8-c119-000000000016] 11389 1726854854.96517: sending task result for task 0affcc66-ac2b-deb8-c119-000000000016 11389 1726854854.96601: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000016 11389 1726854854.96604: WORKER PROCESS EXITING 11389 1726854854.96662: no more pending results, returning what we have 11389 1726854854.96666: in VariableManager get_vars() 11389 1726854854.96707: Calling all_inventory to load vars for managed_node3 11389 1726854854.96710: Calling groups_inventory to load vars for managed_node3 11389 1726854854.96711: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854854.96722: Calling all_plugins_play to load vars for managed_node3 11389 1726854854.96724: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854854.96727: Calling groups_plugins_play to load vars for managed_node3 11389 
1726854854.96843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854854.96956: done with get_vars() 11389 1726854854.96961: variable 'ansible_search_path' from source: unknown 11389 1726854854.96962: variable 'ansible_search_path' from source: unknown 11389 1726854854.96993: we have included files to process 11389 1726854854.96994: generating all_blocks data 11389 1726854854.96995: done generating all_blocks data 11389 1726854854.96995: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854854.96996: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854854.96997: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854854.97154: done processing included file 11389 1726854854.97155: iterating over new_blocks loaded from include file 11389 1726854854.97156: in VariableManager get_vars() 11389 1726854854.97171: done with get_vars() 11389 1726854854.97172: filtering new block on tags 11389 1726854854.97182: done filtering new block on tags 11389 1726854854.97183: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11389 1726854854.97188: extending task lists for all hosts with included blocks 11389 1726854854.97247: done extending task lists 11389 1726854854.97248: done processing included files 11389 1726854854.97248: results queue empty 11389 1726854854.97249: checking for any_errors_fatal 11389 1726854854.97253: done checking for any_errors_fatal 11389 1726854854.97253: checking for max_fail_percentage 11389 1726854854.97254: done checking for 
max_fail_percentage 11389 1726854854.97255: checking to see if all hosts have failed and the running result is not ok 11389 1726854854.97255: done checking to see if all hosts have failed 11389 1726854854.97256: getting the remaining hosts for this loop 11389 1726854854.97256: done getting the remaining hosts for this loop 11389 1726854854.97258: getting the next task for host managed_node3 11389 1726854854.97261: done getting next task for host managed_node3 11389 1726854854.97262: ^ task is: TASK: Get stat for interface {{ interface }} 11389 1726854854.97264: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854854.97265: getting variables 11389 1726854854.97266: in VariableManager get_vars() 11389 1726854854.97276: Calling all_inventory to load vars for managed_node3 11389 1726854854.97278: Calling groups_inventory to load vars for managed_node3 11389 1726854854.97279: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854854.97282: Calling all_plugins_play to load vars for managed_node3 11389 1726854854.97284: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854854.97285: Calling groups_plugins_play to load vars for managed_node3 11389 1726854854.97421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854854.97624: done with get_vars() 11389 1726854854.97633: done getting variables 11389 1726854854.97796: variable 'interface' from source: task vars 11389 1726854854.97801: variable 'dhcp_interface1' from source: play vars 11389 1726854854.97874: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:54:14 -0400 (0:00:00.022) 0:00:07.402 ****** 11389 1726854854.97916: entering _queue_task() for managed_node3/stat 11389 1726854854.98173: worker is 1 (out of 1 available) 11389 1726854854.98185: exiting _queue_task() for managed_node3/stat 11389 1726854854.98196: done queuing things up, now waiting for results queue to drain 11389 1726854854.98198: waiting for pending results... 
11389 1726854854.98611: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 11389 1726854854.98616: in run() - task 0affcc66-ac2b-deb8-c119-000000000152 11389 1726854854.98620: variable 'ansible_search_path' from source: unknown 11389 1726854854.98622: variable 'ansible_search_path' from source: unknown 11389 1726854854.98642: calling self._execute() 11389 1726854854.98741: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854854.98751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854854.98781: variable 'omit' from source: magic vars 11389 1726854854.99134: variable 'ansible_distribution_major_version' from source: facts 11389 1726854854.99256: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854854.99260: variable 'omit' from source: magic vars 11389 1726854854.99263: variable 'omit' from source: magic vars 11389 1726854854.99602: variable 'interface' from source: task vars 11389 1726854854.99605: variable 'dhcp_interface1' from source: play vars 11389 1726854854.99607: variable 'dhcp_interface1' from source: play vars 11389 1726854854.99670: variable 'omit' from source: magic vars 11389 1726854854.99727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854854.99770: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854854.99798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854854.99824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854854.99849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854854.99892: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11389 1726854854.99902: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854854.99910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.00021: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854855.00092: Set connection var ansible_timeout to 10 11389 1726854855.00095: Set connection var ansible_connection to ssh 11389 1726854855.00097: Set connection var ansible_shell_type to sh 11389 1726854855.00099: Set connection var ansible_pipelining to False 11389 1726854855.00102: Set connection var ansible_shell_executable to /bin/sh 11389 1726854855.00104: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.00106: variable 'ansible_connection' from source: unknown 11389 1726854855.00108: variable 'ansible_module_compression' from source: unknown 11389 1726854855.00110: variable 'ansible_shell_type' from source: unknown 11389 1726854855.00112: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.00114: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.00121: variable 'ansible_pipelining' from source: unknown 11389 1726854855.00128: variable 'ansible_timeout' from source: unknown 11389 1726854855.00135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.00344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854855.00366: variable 'omit' from source: magic vars 11389 1726854855.00473: starting attempt loop 11389 1726854855.00477: running the handler 11389 1726854855.00479: _low_level_execute_command(): starting 11389 1726854855.00481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 
1726854855.01566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.01610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.01634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854855.01654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854855.01761: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.01813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.01945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.03727: stdout chunk (state=3): >>>/root <<< 11389 1726854855.03792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.03804: stdout chunk (state=3): >>><<< 11389 1726854855.03816: stderr chunk (state=3): >>><<< 11389 1726854855.03840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.03861: _low_level_execute_command(): starting 11389 1726854855.03873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056 `" && echo ansible-tmp-1726854855.0384717-11828-157382496175056="` echo /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056 `" ) && sleep 0' 11389 1726854855.04505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854855.04517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.04532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.04550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854855.04577: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854855.04593: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854855.04608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.04628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854855.04704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.04726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.04745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.04768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.04859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.06829: stdout chunk (state=3): >>>ansible-tmp-1726854855.0384717-11828-157382496175056=/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056 <<< 11389 1726854855.07014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.07017: stdout chunk (state=3): >>><<< 11389 1726854855.07020: stderr chunk (state=3): >>><<< 11389 1726854855.07022: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854855.0384717-11828-157382496175056=/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.07072: variable 'ansible_module_compression' from source: unknown 11389 1726854855.07139: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854855.07179: variable 'ansible_facts' from source: unknown 11389 1726854855.07283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py 11389 1726854855.07483: Sending initial data 11389 1726854855.07486: Sent initial data (153 bytes) 11389 1726854855.07975: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.07979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854855.07984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854855.07986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.08002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.08044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.08058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.08123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.09956: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854855.10012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854855.10089: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpht2vhfko /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py <<< 11389 1726854855.10093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py" <<< 11389 1726854855.10294: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpht2vhfko" to remote "/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py" <<< 11389 1726854855.11551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.11616: stderr chunk (state=3): >>><<< 11389 1726854855.11619: stdout chunk (state=3): >>><<< 11389 1726854855.11673: done transferring module to remote 11389 1726854855.11685: _low_level_execute_command(): starting 11389 1726854855.11691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/ /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py && sleep 0' 11389 1726854855.12821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854855.12832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.12902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.13273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.13276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.13278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.15074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.15114: stderr chunk (state=3): >>><<< 11389 1726854855.15120: stdout chunk (state=3): >>><<< 11389 1726854855.15140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.15143: _low_level_execute_command(): starting 11389 1726854855.15148: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/AnsiballZ_stat.py && sleep 0' 11389 1726854855.16371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.16604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.16699: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11389 1726854855.31910: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1726854853.6892684, "mtime": 1726854853.6892684, "ctime": 1726854853.6892684, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854855.33165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.33194: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854855.33247: stderr chunk (state=3): >>><<< 11389 1726854855.33259: stdout chunk (state=3): >>><<< 11389 1726854855.33349: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27482, "dev": 23, "nlink": 1, "atime": 1726854853.6892684, "mtime": 1726854853.6892684, "ctime": 1726854853.6892684, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854855.33451: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854855.33603: _low_level_execute_command(): starting 11389 1726854855.33622: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854855.0384717-11828-157382496175056/ > /dev/null 2>&1 && sleep 0' 11389 1726854855.34838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.34857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.34911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.34933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.35018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.35119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.37097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.37101: stdout chunk (state=3): >>><<< 11389 1726854855.37103: stderr chunk (state=3): >>><<< 11389 1726854855.37357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.37361: handler run complete 11389 1726854855.37363: attempt loop complete, returning result 11389 1726854855.37392: _execute() done 11389 1726854855.37401: dumping result to json 11389 1726854855.37579: done dumping result, returning 11389 1726854855.37582: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0affcc66-ac2b-deb8-c119-000000000152] 11389 1726854855.37584: sending task result for task 0affcc66-ac2b-deb8-c119-000000000152 11389 1726854855.37659: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000152 11389 1726854855.37662: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726854853.6892684, "block_size": 4096, "blocks": 0, "ctime": 1726854853.6892684, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27482, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726854853.6892684, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11389 1726854855.37762: no more pending results, returning what we have 11389 1726854855.37766: results queue empty 11389 1726854855.37770: checking for any_errors_fatal 11389 1726854855.37771: done checking for any_errors_fatal 11389 
1726854855.37772: checking for max_fail_percentage 11389 1726854855.37774: done checking for max_fail_percentage 11389 1726854855.37775: checking to see if all hosts have failed and the running result is not ok 11389 1726854855.37776: done checking to see if all hosts have failed 11389 1726854855.37777: getting the remaining hosts for this loop 11389 1726854855.37779: done getting the remaining hosts for this loop 11389 1726854855.37783: getting the next task for host managed_node3 11389 1726854855.37894: done getting next task for host managed_node3 11389 1726854855.37906: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11389 1726854855.37910: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854855.37915: getting variables 11389 1726854855.37917: in VariableManager get_vars() 11389 1726854855.37959: Calling all_inventory to load vars for managed_node3 11389 1726854855.37962: Calling groups_inventory to load vars for managed_node3 11389 1726854855.37965: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.37981: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.37984: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.38344: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.38940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.39717: done with get_vars() 11389 1726854855.39729: done getting variables 11389 1726854855.40056: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 11389 1726854855.40596: variable 'interface' from source: task vars 11389 1726854855.40600: variable 'dhcp_interface1' from source: play vars 11389 1726854855.40720: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:54:15 -0400 (0:00:00.429) 0:00:07.831 ****** 11389 1726854855.40875: entering _queue_task() for managed_node3/assert 11389 1726854855.40877: Creating lock for assert 11389 1726854855.41704: worker is 1 (out of 1 available) 11389 1726854855.41716: exiting _queue_task() for managed_node3/assert 11389 1726854855.41726: done queuing things up, now waiting for results queue to drain 11389 
1726854855.41728: waiting for pending results... 11389 1726854855.42506: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 11389 1726854855.42593: in run() - task 0affcc66-ac2b-deb8-c119-000000000017 11389 1726854855.42599: variable 'ansible_search_path' from source: unknown 11389 1726854855.42601: variable 'ansible_search_path' from source: unknown 11389 1726854855.42603: calling self._execute() 11389 1726854855.42793: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.42796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.42799: variable 'omit' from source: magic vars 11389 1726854855.43564: variable 'ansible_distribution_major_version' from source: facts 11389 1726854855.43580: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854855.43735: variable 'omit' from source: magic vars 11389 1726854855.43739: variable 'omit' from source: magic vars 11389 1726854855.43888: variable 'interface' from source: task vars 11389 1726854855.43959: variable 'dhcp_interface1' from source: play vars 11389 1726854855.44024: variable 'dhcp_interface1' from source: play vars 11389 1726854855.44082: variable 'omit' from source: magic vars 11389 1726854855.44211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854855.44250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854855.44495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854855.44498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.44501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.44503: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854855.44506: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.44508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.44682: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854855.44700: Set connection var ansible_timeout to 10 11389 1726854855.44892: Set connection var ansible_connection to ssh 11389 1726854855.44895: Set connection var ansible_shell_type to sh 11389 1726854855.44897: Set connection var ansible_pipelining to False 11389 1726854855.44900: Set connection var ansible_shell_executable to /bin/sh 11389 1726854855.44902: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.44903: variable 'ansible_connection' from source: unknown 11389 1726854855.44905: variable 'ansible_module_compression' from source: unknown 11389 1726854855.44907: variable 'ansible_shell_type' from source: unknown 11389 1726854855.44909: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.44910: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.44912: variable 'ansible_pipelining' from source: unknown 11389 1726854855.44914: variable 'ansible_timeout' from source: unknown 11389 1726854855.44916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.45131: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854855.45159: variable 'omit' from source: magic vars 11389 1726854855.45363: starting attempt loop 11389 1726854855.45367: running the handler 11389 1726854855.45509: variable 'interface_stat' from source: set_fact 11389 
1726854855.45533: Evaluated conditional (interface_stat.stat.exists): True 11389 1726854855.45544: handler run complete 11389 1726854855.45563: attempt loop complete, returning result 11389 1726854855.45585: _execute() done 11389 1726854855.45795: dumping result to json 11389 1726854855.45798: done dumping result, returning 11389 1726854855.45801: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0affcc66-ac2b-deb8-c119-000000000017] 11389 1726854855.45803: sending task result for task 0affcc66-ac2b-deb8-c119-000000000017 11389 1726854855.45872: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000017 11389 1726854855.45875: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854855.45946: no more pending results, returning what we have 11389 1726854855.45950: results queue empty 11389 1726854855.45950: checking for any_errors_fatal 11389 1726854855.45958: done checking for any_errors_fatal 11389 1726854855.45959: checking for max_fail_percentage 11389 1726854855.45961: done checking for max_fail_percentage 11389 1726854855.45963: checking to see if all hosts have failed and the running result is not ok 11389 1726854855.45964: done checking to see if all hosts have failed 11389 1726854855.45965: getting the remaining hosts for this loop 11389 1726854855.45966: done getting the remaining hosts for this loop 11389 1726854855.45969: getting the next task for host managed_node3 11389 1726854855.45979: done getting next task for host managed_node3 11389 1726854855.45982: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11389 1726854855.45985: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
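The assert task traced above (conditional `interface_stat.stat.exists`, result "All assertions passed") plausibly resembles the following. This is a minimal sketch, not the actual contents of `assert_device_present.yml`; the task name and variable names are taken from the log, everything else is an assumption.

```yaml
# Hypothetical reconstruction of the assert task seen in the trace.
# "All assertions passed" is ansible.builtin.assert's default success
# message, so no custom msg is assumed here.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```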
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854855.45990: getting variables 11389 1726854855.45992: in VariableManager get_vars() 11389 1726854855.46033: Calling all_inventory to load vars for managed_node3 11389 1726854855.46036: Calling groups_inventory to load vars for managed_node3 11389 1726854855.46039: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.46050: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.46053: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.46056: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.46564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.47364: done with get_vars() 11389 1726854855.47379: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:54:15 -0400 (0:00:00.068) 0:00:07.900 ****** 11389 1726854855.47707: entering _queue_task() for managed_node3/include_tasks 11389 1726854855.48629: worker is 1 (out of 1 available) 11389 1726854855.48641: exiting _queue_task() for managed_node3/include_tasks 11389 1726854855.48766: done queuing things up, now waiting for results queue to drain 11389 1726854855.48771: waiting for pending results... 
11389 1726854855.49107: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11389 1726854855.49334: in run() - task 0affcc66-ac2b-deb8-c119-00000000001b 11389 1726854855.49338: variable 'ansible_search_path' from source: unknown 11389 1726854855.49418: variable 'ansible_search_path' from source: unknown 11389 1726854855.49422: calling self._execute() 11389 1726854855.49642: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.49659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.49856: variable 'omit' from source: magic vars 11389 1726854855.50994: variable 'ansible_distribution_major_version' from source: facts 11389 1726854855.50999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854855.51001: _execute() done 11389 1726854855.51004: dumping result to json 11389 1726854855.51008: done dumping result, returning 11389 1726854855.51012: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-deb8-c119-00000000001b] 11389 1726854855.51014: sending task result for task 0affcc66-ac2b-deb8-c119-00000000001b 11389 1726854855.51154: no more pending results, returning what we have 11389 1726854855.51160: in VariableManager get_vars() 11389 1726854855.51219: Calling all_inventory to load vars for managed_node3 11389 1726854855.51223: Calling groups_inventory to load vars for managed_node3 11389 1726854855.51226: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.51240: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.51243: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.51246: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.52070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.52684: done 
sending task result for task 0affcc66-ac2b-deb8-c119-00000000001b 11389 1726854855.52690: WORKER PROCESS EXITING 11389 1726854855.53656: done with get_vars() 11389 1726854855.53665: variable 'ansible_search_path' from source: unknown 11389 1726854855.53666: variable 'ansible_search_path' from source: unknown 11389 1726854855.53919: we have included files to process 11389 1726854855.53920: generating all_blocks data 11389 1726854855.53922: done generating all_blocks data 11389 1726854855.53925: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854855.53926: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854855.53929: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854855.54611: done processing included file 11389 1726854855.54613: iterating over new_blocks loaded from include file 11389 1726854855.54615: in VariableManager get_vars() 11389 1726854855.54638: done with get_vars() 11389 1726854855.54640: filtering new block on tags 11389 1726854855.54777: done filtering new block on tags 11389 1726854855.54780: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11389 1726854855.54883: extending task lists for all hosts with included blocks 11389 1726854855.55205: done extending task lists 11389 1726854855.55207: done processing included files 11389 1726854855.55207: results queue empty 11389 1726854855.55208: checking for any_errors_fatal 11389 1726854855.55211: done checking for any_errors_fatal 11389 1726854855.55212: checking for max_fail_percentage 11389 1726854855.55214: done checking for max_fail_percentage 
11389 1726854855.55215: checking to see if all hosts have failed and the running result is not ok 11389 1726854855.55216: done checking to see if all hosts have failed 11389 1726854855.55216: getting the remaining hosts for this loop 11389 1726854855.55217: done getting the remaining hosts for this loop 11389 1726854855.55336: getting the next task for host managed_node3 11389 1726854855.55342: done getting next task for host managed_node3 11389 1726854855.55345: ^ task is: TASK: Get stat for interface {{ interface }} 11389 1726854855.55348: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854855.55351: getting variables 11389 1726854855.55352: in VariableManager get_vars() 11389 1726854855.55373: Calling all_inventory to load vars for managed_node3 11389 1726854855.55376: Calling groups_inventory to load vars for managed_node3 11389 1726854855.55378: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.55384: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.55386: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.55392: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.55934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.56384: done with get_vars() 11389 1726854855.56397: done getting variables 11389 1726854855.56872: variable 'interface' from source: task vars 11389 1726854855.56876: variable 'dhcp_interface2' from source: play vars 11389 1726854855.57140: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:54:15 -0400 (0:00:00.095) 0:00:07.995 ****** 11389 1726854855.57217: entering _queue_task() for managed_node3/stat 11389 1726854855.57995: worker is 1 (out of 1 available) 11389 1726854855.58072: exiting _queue_task() for managed_node3/stat 11389 1726854855.58085: done queuing things up, now waiting for results queue to drain 11389 1726854855.58086: waiting for pending results... 
11389 1726854855.58356: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 11389 1726854855.58502: in run() - task 0affcc66-ac2b-deb8-c119-00000000016a 11389 1726854855.58523: variable 'ansible_search_path' from source: unknown 11389 1726854855.58531: variable 'ansible_search_path' from source: unknown 11389 1726854855.58575: calling self._execute() 11389 1726854855.58679: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.58693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.58717: variable 'omit' from source: magic vars 11389 1726854855.59322: variable 'ansible_distribution_major_version' from source: facts 11389 1726854855.59340: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854855.59358: variable 'omit' from source: magic vars 11389 1726854855.59413: variable 'omit' from source: magic vars 11389 1726854855.59525: variable 'interface' from source: task vars 11389 1726854855.59540: variable 'dhcp_interface2' from source: play vars 11389 1726854855.59614: variable 'dhcp_interface2' from source: play vars 11389 1726854855.59636: variable 'omit' from source: magic vars 11389 1726854855.59692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854855.59731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854855.59760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854855.59783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.59839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.59901: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11389 1726854855.59924: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.59933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.60164: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854855.60177: Set connection var ansible_timeout to 10 11389 1726854855.60184: Set connection var ansible_connection to ssh 11389 1726854855.60497: Set connection var ansible_shell_type to sh 11389 1726854855.60501: Set connection var ansible_pipelining to False 11389 1726854855.60503: Set connection var ansible_shell_executable to /bin/sh 11389 1726854855.60553: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.60567: variable 'ansible_connection' from source: unknown 11389 1726854855.60611: variable 'ansible_module_compression' from source: unknown 11389 1726854855.60620: variable 'ansible_shell_type' from source: unknown 11389 1726854855.60897: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.60900: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.60902: variable 'ansible_pipelining' from source: unknown 11389 1726854855.60905: variable 'ansible_timeout' from source: unknown 11389 1726854855.60907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.61078: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854855.61201: variable 'omit' from source: magic vars 11389 1726854855.61211: starting attempt loop 11389 1726854855.61225: running the handler 11389 1726854855.61247: _low_level_execute_command(): starting 11389 1726854855.61259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 
1726854855.62775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.62823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.62842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.63019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.63122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.64817: stdout chunk (state=3): >>>/root <<< 11389 1726854855.65018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.65031: stdout chunk (state=3): >>><<< 11389 1726854855.65054: stderr chunk (state=3): >>><<< 11389 1726854855.65273: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.65278: _low_level_execute_command(): starting 11389 1726854855.65281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881 `" && echo ansible-tmp-1726854855.6517391-11856-9515501435881="` echo /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881 `" ) && sleep 0' 11389 1726854855.66510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.66712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.66905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.66938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.68845: stdout chunk (state=3): >>>ansible-tmp-1726854855.6517391-11856-9515501435881=/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881 <<< 11389 1726854855.68949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.68972: stderr chunk (state=3): >>><<< 11389 1726854855.68979: stdout chunk (state=3): >>><<< 11389 1726854855.69001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854855.6517391-11856-9515501435881=/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
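The `_low_level_execute_command()` calls above follow Ansible's usual remote bootstrap pattern: resolve the remote home directory, then create a private per-task temp directory under `~/.ansible/tmp` with a restrictive umask. The sketch below illustrates that sequence; the directory name is an example, not the real timestamped name from the log.

```shell
# Illustrative bootstrap sequence (pattern from the trace above).
# Step 1: resolve the remote user's home directory.
home="$(echo ~)"

# Step 2: create the private temp dir, as in the traced
# '( umask 77 && mkdir -p ... && mkdir ... )' command.
# "ansible-tmp-example" stands in for the real timestamp-PID suffix.
tmpdir="$home/.ansible/tmp/ansible-tmp-example"
( umask 77 && mkdir -p "$home/.ansible/tmp" && mkdir -p "$tmpdir" )

# Step 3: mark the dir (and later the transferred AnsiballZ module)
# user-executable, as in the traced 'chmod u+x' command.
chmod u+x "$tmpdir"
echo "$tmpdir"
```

With `umask 77` the created directory is only accessible to the remote user, which is why Ansible wraps the `mkdir` in a subshell rather than changing the umask of the whole session.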
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.69093: variable 'ansible_module_compression' from source: unknown 11389 1726854855.69096: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854855.69127: variable 'ansible_facts' from source: unknown 11389 1726854855.69183: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py 11389 1726854855.69283: Sending initial data 11389 1726854855.69286: Sent initial data (151 bytes) 11389 1726854855.69691: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854855.69727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.69730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854855.69733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.69735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.69737: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.69801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.69804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.69850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.71504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854855.71536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854855.71617: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpkjye_ift /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py <<< 11389 1726854855.71620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py" <<< 11389 1726854855.71658: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpkjye_ift" to remote "/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py" <<< 11389 1726854855.72567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.72591: stderr chunk (state=3): >>><<< 11389 1726854855.72594: stdout chunk (state=3): >>><<< 11389 1726854855.72643: done transferring module to remote 11389 1726854855.72669: _low_level_execute_command(): starting 11389 1726854855.72673: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/ /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py && sleep 0' 11389 1726854855.73200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854855.73252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.73256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.73259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854855.73261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 
1726854855.73263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.73265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.73267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854855.73269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.73323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854855.73331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.73341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.73412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.75194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.75215: stderr chunk (state=3): >>><<< 11389 1726854855.75218: stdout chunk (state=3): >>><<< 11389 1726854855.75240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.75245: _low_level_execute_command(): starting 11389 1726854855.75254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/AnsiballZ_stat.py && sleep 0' 11389 1726854855.76132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854855.76137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854855.76140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.76146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found <<< 11389 1726854855.76149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.76212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.76215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.76352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.91348: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1726854853.6956015, "mtime": 1726854853.6956015, "ctime": 1726854853.6956015, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854855.92683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854855.92713: stderr chunk (state=3): >>><<< 11389 1726854855.92716: stdout chunk (state=3): >>><<< 11389 1726854855.92743: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27888, "dev": 23, "nlink": 1, "atime": 1726854853.6956015, "mtime": 1726854853.6956015, "ctime": 1726854853.6956015, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854855.92804: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854855.92808: _low_level_execute_command(): starting 11389 1726854855.92810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854855.6517391-11856-9515501435881/ > /dev/null 2>&1 && sleep 0' 11389 1726854855.93407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.93415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854855.93419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854855.93448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854855.93455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854855.93546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854855.95433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854855.95451: stderr chunk (state=3): >>><<< 11389 1726854855.95454: stdout chunk (state=3): >>><<< 11389 1726854855.95476: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854855.95479: handler run complete 11389 1726854855.95527: attempt loop complete, returning result 11389 1726854855.95530: _execute() done 11389 1726854855.95533: dumping result to json 11389 1726854855.95537: done dumping result, returning 11389 1726854855.95546: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0affcc66-ac2b-deb8-c119-00000000016a] 11389 1726854855.95551: sending task result for task 0affcc66-ac2b-deb8-c119-00000000016a 11389 1726854855.95664: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000016a 11389 1726854855.95669: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726854853.6956015, "block_size": 4096, "blocks": 0, "ctime": 1726854853.6956015, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27888, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726854853.6956015, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11389 1726854855.95782: no more pending results, returning what we have 11389 1726854855.95793: results queue empty 11389 1726854855.95794: checking for any_errors_fatal 11389 1726854855.95795: done checking for any_errors_fatal 11389 
1726854855.95799: checking for max_fail_percentage 11389 1726854855.95800: done checking for max_fail_percentage 11389 1726854855.95801: checking to see if all hosts have failed and the running result is not ok 11389 1726854855.95802: done checking to see if all hosts have failed 11389 1726854855.95803: getting the remaining hosts for this loop 11389 1726854855.95804: done getting the remaining hosts for this loop 11389 1726854855.95807: getting the next task for host managed_node3 11389 1726854855.95817: done getting next task for host managed_node3 11389 1726854855.95819: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11389 1726854855.95822: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854855.95825: getting variables 11389 1726854855.95826: in VariableManager get_vars() 11389 1726854855.95861: Calling all_inventory to load vars for managed_node3 11389 1726854855.95864: Calling groups_inventory to load vars for managed_node3 11389 1726854855.95866: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.95877: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.95879: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.95881: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.96121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.96240: done with get_vars() 11389 1726854855.96248: done getting variables 11389 1726854855.96294: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854855.96384: variable 'interface' from source: task vars 11389 1726854855.96388: variable 'dhcp_interface2' from source: play vars 11389 1726854855.96429: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:54:15 -0400 (0:00:00.392) 0:00:08.387 ****** 11389 1726854855.96452: entering _queue_task() for managed_node3/assert 11389 1726854855.96675: worker is 1 (out of 1 available) 11389 1726854855.96685: exiting _queue_task() for managed_node3/assert 11389 1726854855.96897: done queuing things up, now waiting for results queue to drain 11389 1726854855.96900: waiting for pending results... 
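For orientation, the "Get stat for interface test2" task whose result is dumped above can be reconstructed roughly as follows. This is a sketch inferred from the `module_args` in the log's `invocation` block and the later `interface_stat` variable lookup; the actual YAML lives in the collection's `assert_device_present.yml` and may template the path via `{{ interface }}` rather than hard-coding it:

```yaml
# Sketch reconstructed from the "invocation" -> "module_args" shown in the
# log above; field values are illustrative, not copied from the source file.
- name: Get stat for interface test2
  ansible.builtin.stat:
    path: /sys/class/net/test2      # a symlink into /sys/devices/virtual/net/
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false                   # stat the link itself, hence islnk: true
  register: interface_stat          # name inferred from the later lookup
```

Because `follow: false` is set, the module stats the symlink itself, which is why the result reports `islnk: true` with `lnk_source`/`lnk_target` pointing into `/sys/devices/virtual/net/test2`.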
11389 1726854855.97410: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 11389 1726854855.97415: in run() - task 0affcc66-ac2b-deb8-c119-00000000001c 11389 1726854855.97418: variable 'ansible_search_path' from source: unknown 11389 1726854855.97420: variable 'ansible_search_path' from source: unknown 11389 1726854855.97422: calling self._execute() 11389 1726854855.97425: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.97427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.97430: variable 'omit' from source: magic vars 11389 1726854855.97537: variable 'ansible_distribution_major_version' from source: facts 11389 1726854855.97548: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854855.97554: variable 'omit' from source: magic vars 11389 1726854855.97611: variable 'omit' from source: magic vars 11389 1726854855.97749: variable 'interface' from source: task vars 11389 1726854855.97753: variable 'dhcp_interface2' from source: play vars 11389 1726854855.97793: variable 'dhcp_interface2' from source: play vars 11389 1726854855.97797: variable 'omit' from source: magic vars 11389 1726854855.97818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854855.97858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854855.97873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854855.97894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.97907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854855.97940: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 11389 1726854855.97944: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.97947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.98040: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854855.98048: Set connection var ansible_timeout to 10 11389 1726854855.98051: Set connection var ansible_connection to ssh 11389 1726854855.98054: Set connection var ansible_shell_type to sh 11389 1726854855.98059: Set connection var ansible_pipelining to False 11389 1726854855.98064: Set connection var ansible_shell_executable to /bin/sh 11389 1726854855.98084: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.98090: variable 'ansible_connection' from source: unknown 11389 1726854855.98093: variable 'ansible_module_compression' from source: unknown 11389 1726854855.98095: variable 'ansible_shell_type' from source: unknown 11389 1726854855.98097: variable 'ansible_shell_executable' from source: unknown 11389 1726854855.98105: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.98109: variable 'ansible_pipelining' from source: unknown 11389 1726854855.98111: variable 'ansible_timeout' from source: unknown 11389 1726854855.98116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.98251: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854855.98303: variable 'omit' from source: magic vars 11389 1726854855.98307: starting attempt loop 11389 1726854855.98309: running the handler 11389 1726854855.98403: variable 'interface_stat' from source: set_fact 11389 1726854855.98437: Evaluated conditional 
(interface_stat.stat.exists): True 11389 1726854855.98440: handler run complete 11389 1726854855.98443: attempt loop complete, returning result 11389 1726854855.98446: _execute() done 11389 1726854855.98448: dumping result to json 11389 1726854855.98451: done dumping result, returning 11389 1726854855.98453: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0affcc66-ac2b-deb8-c119-00000000001c] 11389 1726854855.98500: sending task result for task 0affcc66-ac2b-deb8-c119-00000000001c 11389 1726854855.98645: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000001c 11389 1726854855.98688: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854855.98725: no more pending results, returning what we have 11389 1726854855.98728: results queue empty 11389 1726854855.98729: checking for any_errors_fatal 11389 1726854855.98734: done checking for any_errors_fatal 11389 1726854855.98735: checking for max_fail_percentage 11389 1726854855.98736: done checking for max_fail_percentage 11389 1726854855.98737: checking to see if all hosts have failed and the running result is not ok 11389 1726854855.98738: done checking to see if all hosts have failed 11389 1726854855.98739: getting the remaining hosts for this loop 11389 1726854855.98740: done getting the remaining hosts for this loop 11389 1726854855.98742: getting the next task for host managed_node3 11389 1726854855.98747: done getting next task for host managed_node3 11389 1726854855.98748: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11389 1726854855.98750: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854855.98752: getting variables 11389 1726854855.98753: in VariableManager get_vars() 11389 1726854855.98780: Calling all_inventory to load vars for managed_node3 11389 1726854855.98782: Calling groups_inventory to load vars for managed_node3 11389 1726854855.98784: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854855.98792: Calling all_plugins_play to load vars for managed_node3 11389 1726854855.98794: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854855.98796: Calling groups_plugins_play to load vars for managed_node3 11389 1726854855.98902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854855.99024: done with get_vars() 11389 1726854855.99032: done getting variables 11389 1726854855.99073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 13:54:15 -0400 (0:00:00.026) 0:00:08.414 ****** 11389 1726854855.99095: entering _queue_task() for managed_node3/command 11389 1726854855.99276: worker is 1 (out of 1 available) 11389 1726854855.99289: exiting _queue_task() for managed_node3/command 11389 1726854855.99299: done queuing things up, now waiting for results queue to drain 11389 1726854855.99301: waiting for pending results... 
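The assert task queued above (task path `assert_device_present.yml:5`) evaluates the conditional `interface_stat.stat.exists` seen later in the log. A minimal sketch, assuming the assertion list contains only the condition the log shows being evaluated:

```yaml
# Sketch based on the task name and the single evaluated conditional in the
# log; the on-disk task may carry additional assertions or a fail message.
- name: "Assert that the interface is present - '{{ interface }}'"
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```

With `interface` resolving to `dhcp_interface2` (`test2`), the condition evaluates True and the task reports "All assertions passed".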
11389 1726854855.99457: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 11389 1726854855.99513: in run() - task 0affcc66-ac2b-deb8-c119-00000000001d 11389 1726854855.99523: variable 'ansible_search_path' from source: unknown 11389 1726854855.99555: calling self._execute() 11389 1726854855.99620: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854855.99623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854855.99632: variable 'omit' from source: magic vars 11389 1726854855.99947: variable 'ansible_distribution_major_version' from source: facts 11389 1726854855.99954: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.00035: variable 'network_provider' from source: set_fact 11389 1726854856.00039: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854856.00041: when evaluation is False, skipping this task 11389 1726854856.00044: _execute() done 11389 1726854856.00046: dumping result to json 11389 1726854856.00049: done dumping result, returning 11389 1726854856.00056: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [0affcc66-ac2b-deb8-c119-00000000001d] 11389 1726854856.00060: sending task result for task 0affcc66-ac2b-deb8-c119-00000000001d 11389 1726854856.00142: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000001d 11389 1726854856.00145: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11389 1726854856.00221: no more pending results, returning what we have 11389 1726854856.00224: results queue empty 11389 1726854856.00225: checking for any_errors_fatal 11389 1726854856.00229: done checking for any_errors_fatal 11389 1726854856.00230: checking for max_fail_percentage 11389 1726854856.00231: done checking 
for max_fail_percentage 11389 1726854856.00232: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.00233: done checking to see if all hosts have failed 11389 1726854856.00233: getting the remaining hosts for this loop 11389 1726854856.00234: done getting the remaining hosts for this loop 11389 1726854856.00237: getting the next task for host managed_node3 11389 1726854856.00241: done getting next task for host managed_node3 11389 1726854856.00243: ^ task is: TASK: TEST Add Bond with 2 ports 11389 1726854856.00245: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854856.00248: getting variables 11389 1726854856.00249: in VariableManager get_vars() 11389 1726854856.00276: Calling all_inventory to load vars for managed_node3 11389 1726854856.00277: Calling groups_inventory to load vars for managed_node3 11389 1726854856.00279: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.00285: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.00289: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.00292: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.00431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.00542: done with get_vars() 11389 1726854856.00548: done getting variables 11389 1726854856.00585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 13:54:16 -0400 (0:00:00.015) 0:00:08.429 ****** 11389 1726854856.00604: entering _queue_task() for managed_node3/debug 11389 1726854856.00768: worker is 1 (out of 1 available) 11389 1726854856.00780: exiting _queue_task() for managed_node3/debug 11389 1726854856.00790: done queuing things up, now waiting for results queue to drain 11389 1726854856.00792: waiting for pending results... 11389 1726854856.00928: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 11389 1726854856.00981: in run() - task 0affcc66-ac2b-deb8-c119-00000000001e 11389 1726854856.00994: variable 'ansible_search_path' from source: unknown 11389 1726854856.01025: calling self._execute() 11389 1726854856.01080: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.01083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.01095: variable 'omit' from source: magic vars 11389 1726854856.01500: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.01503: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.01505: variable 'omit' from source: magic vars 11389 1726854856.01507: variable 'omit' from source: magic vars 11389 1726854856.01508: variable 'omit' from source: magic vars 11389 1726854856.01513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854856.01546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854856.01567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854856.01586: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854856.01615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854856.01647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854856.01655: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.01663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.01777: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854856.01794: Set connection var ansible_timeout to 10 11389 1726854856.01802: Set connection var ansible_connection to ssh 11389 1726854856.01812: Set connection var ansible_shell_type to sh 11389 1726854856.01833: Set connection var ansible_pipelining to False 11389 1726854856.01844: Set connection var ansible_shell_executable to /bin/sh 11389 1726854856.01868: variable 'ansible_shell_executable' from source: unknown 11389 1726854856.01877: variable 'ansible_connection' from source: unknown 11389 1726854856.01884: variable 'ansible_module_compression' from source: unknown 11389 1726854856.01893: variable 'ansible_shell_type' from source: unknown 11389 1726854856.01933: variable 'ansible_shell_executable' from source: unknown 11389 1726854856.01935: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.01937: variable 'ansible_pipelining' from source: unknown 11389 1726854856.01939: variable 'ansible_timeout' from source: unknown 11389 1726854856.01941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.02051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854856.02064: variable 'omit' from source: magic vars 11389 1726854856.02073: starting attempt loop 11389 1726854856.02151: running the handler 11389 1726854856.02154: handler run complete 11389 1726854856.02157: attempt loop complete, returning result 11389 1726854856.02159: _execute() done 11389 1726854856.02161: dumping result to json 11389 1726854856.02169: done dumping result, returning 11389 1726854856.02179: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [0affcc66-ac2b-deb8-c119-00000000001e] 11389 1726854856.02191: sending task result for task 0affcc66-ac2b-deb8-c119-00000000001e 11389 1726854856.02367: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000001e 11389 1726854856.02371: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 11389 1726854856.02419: no more pending results, returning what we have 11389 1726854856.02423: results queue empty 11389 1726854856.02423: checking for any_errors_fatal 11389 1726854856.02428: done checking for any_errors_fatal 11389 1726854856.02428: checking for max_fail_percentage 11389 1726854856.02431: done checking for max_fail_percentage 11389 1726854856.02432: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.02433: done checking to see if all hosts have failed 11389 1726854856.02433: getting the remaining hosts for this loop 11389 1726854856.02434: done getting the remaining hosts for this loop 11389 1726854856.02437: getting the next task for host managed_node3 11389 1726854856.02445: done getting next task for host managed_node3 11389 1726854856.02450: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11389 1726854856.02453: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854856.02469: getting variables 11389 1726854856.02472: in VariableManager get_vars() 11389 1726854856.02511: Calling all_inventory to load vars for managed_node3 11389 1726854856.02514: Calling groups_inventory to load vars for managed_node3 11389 1726854856.02517: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.02528: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.02531: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.02534: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.02822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.03073: done with get_vars() 11389 1726854856.03084: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:54:16 -0400 (0:00:00.025) 0:00:08.454 ****** 11389 1726854856.03183: entering _queue_task() for managed_node3/include_tasks 11389 1726854856.03500: worker is 1 (out of 1 available) 11389 1726854856.03513: exiting _queue_task() for managed_node3/include_tasks 11389 1726854856.03525: done queuing things up, now waiting for results queue to drain 11389 1726854856.03526: waiting for pending results... 
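The two preceding tasks illustrate Ansible's `when` short-circuit: the backup task is skipped before any module payload is built, so its command line never appears in the log, while the debug task runs entirely on the controller. A hedged sketch of both (the backup command body is elided because it is not visible in this excerpt; `backup_command` below is a hypothetical placeholder):

```yaml
# tests_bond.yml:28 -- skipped: network_provider == "initscripts" is False,
# so the module is never assembled or shipped to the target.
- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command: "{{ backup_command }}"  # hypothetical placeholder;
                                                   # real command not in log
  when: network_provider == "initscripts"

# tests_bond.yml:33 -- runs locally via the debug action plugin; the msg
# matches the banner of '#' characters printed in the log.
- name: TEST Add Bond with 2 ports
  ansible.builtin.debug:
    msg: "##################################################"
```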
11389 1726854856.03806: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11389 1726854856.03870: in run() - task 0affcc66-ac2b-deb8-c119-000000000026 11389 1726854856.03900: variable 'ansible_search_path' from source: unknown 11389 1726854856.03912: variable 'ansible_search_path' from source: unknown 11389 1726854856.03956: calling self._execute() 11389 1726854856.04052: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.04064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.04079: variable 'omit' from source: magic vars 11389 1726854856.04497: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.04501: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.04734: _execute() done 11389 1726854856.04738: dumping result to json 11389 1726854856.04741: done dumping result, returning 11389 1726854856.04744: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-deb8-c119-000000000026] 11389 1726854856.04746: sending task result for task 0affcc66-ac2b-deb8-c119-000000000026 11389 1726854856.04906: no more pending results, returning what we have 11389 1726854856.04911: in VariableManager get_vars() 11389 1726854856.04960: Calling all_inventory to load vars for managed_node3 11389 1726854856.04963: Calling groups_inventory to load vars for managed_node3 11389 1726854856.04966: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.04978: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.04981: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.04985: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.05381: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000026 11389 
1726854856.05385: WORKER PROCESS EXITING 11389 1726854856.05420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.05671: done with get_vars() 11389 1726854856.05681: variable 'ansible_search_path' from source: unknown 11389 1726854856.05682: variable 'ansible_search_path' from source: unknown 11389 1726854856.05726: we have included files to process 11389 1726854856.05727: generating all_blocks data 11389 1726854856.05729: done generating all_blocks data 11389 1726854856.05733: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854856.05735: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854856.05737: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854856.06469: done processing included file 11389 1726854856.06471: iterating over new_blocks loaded from include file 11389 1726854856.06479: in VariableManager get_vars() 11389 1726854856.06531: done with get_vars() 11389 1726854856.06533: filtering new block on tags 11389 1726854856.06551: done filtering new block on tags 11389 1726854856.06553: in VariableManager get_vars() 11389 1726854856.06576: done with get_vars() 11389 1726854856.06578: filtering new block on tags 11389 1726854856.06600: done filtering new block on tags 11389 1726854856.06603: in VariableManager get_vars() 11389 1726854856.06853: done with get_vars() 11389 1726854856.06855: filtering new block on tags 11389 1726854856.06873: done filtering new block on tags 11389 1726854856.06875: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11389 1726854856.06880: extending task lists for all hosts 
with included blocks 11389 1726854856.08804: done extending task lists 11389 1726854856.08806: done processing included files 11389 1726854856.08807: results queue empty 11389 1726854856.08807: checking for any_errors_fatal 11389 1726854856.08811: done checking for any_errors_fatal 11389 1726854856.08812: checking for max_fail_percentage 11389 1726854856.08813: done checking for max_fail_percentage 11389 1726854856.08814: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.08815: done checking to see if all hosts have failed 11389 1726854856.08815: getting the remaining hosts for this loop 11389 1726854856.08816: done getting the remaining hosts for this loop 11389 1726854856.08819: getting the next task for host managed_node3 11389 1726854856.08827: done getting next task for host managed_node3 11389 1726854856.08830: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11389 1726854856.08832: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854856.08842: getting variables 11389 1726854856.08843: in VariableManager get_vars() 11389 1726854856.08863: Calling all_inventory to load vars for managed_node3 11389 1726854856.08865: Calling groups_inventory to load vars for managed_node3 11389 1726854856.08867: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.08872: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.08875: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.08877: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.09351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.09778: done with get_vars() 11389 1726854856.09800: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:54:16 -0400 (0:00:00.066) 0:00:08.521 ****** 11389 1726854856.09880: entering _queue_task() for managed_node3/setup 11389 1726854856.10702: worker is 1 (out of 1 available) 11389 1726854856.10711: exiting _queue_task() for managed_node3/setup 11389 1726854856.10721: done queuing things up, now waiting for results queue to drain 11389 1726854856.10723: waiting for pending results... 
11389 1726854856.11021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11389 1726854856.11257: in run() - task 0affcc66-ac2b-deb8-c119-000000000188 11389 1726854856.11270: variable 'ansible_search_path' from source: unknown 11389 1726854856.11275: variable 'ansible_search_path' from source: unknown 11389 1726854856.11414: calling self._execute() 11389 1726854856.11611: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.11616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.11628: variable 'omit' from source: magic vars 11389 1726854856.12254: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.12266: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.12748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854856.15396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854856.15493: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854856.15585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854856.15721: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854856.15724: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854856.15954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854856.15995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854856.16075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854856.16126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854856.16402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854856.16509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854856.16514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854856.16516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854856.16543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854856.16563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854856.16878: variable '__network_required_facts' from source: role 
'' defaults 11389 1726854856.16962: variable 'ansible_facts' from source: unknown 11389 1726854856.17284: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11389 1726854856.17289: when evaluation is False, skipping this task 11389 1726854856.17291: _execute() done 11389 1726854856.17293: dumping result to json 11389 1726854856.17295: done dumping result, returning 11389 1726854856.17297: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-deb8-c119-000000000188] 11389 1726854856.17299: sending task result for task 0affcc66-ac2b-deb8-c119-000000000188 11389 1726854856.17359: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000188 11389 1726854856.17361: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854856.17422: no more pending results, returning what we have 11389 1726854856.17425: results queue empty 11389 1726854856.17426: checking for any_errors_fatal 11389 1726854856.17427: done checking for any_errors_fatal 11389 1726854856.17428: checking for max_fail_percentage 11389 1726854856.17429: done checking for max_fail_percentage 11389 1726854856.17430: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.17431: done checking to see if all hosts have failed 11389 1726854856.17432: getting the remaining hosts for this loop 11389 1726854856.17433: done getting the remaining hosts for this loop 11389 1726854856.17438: getting the next task for host managed_node3 11389 1726854856.17447: done getting next task for host managed_node3 11389 1726854856.17452: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11389 1726854856.17457: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854856.17472: getting variables 11389 1726854856.17475: in VariableManager get_vars() 11389 1726854856.17527: Calling all_inventory to load vars for managed_node3 11389 1726854856.17530: Calling groups_inventory to load vars for managed_node3 11389 1726854856.17532: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.17544: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.17547: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.17550: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.18217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.18770: done with get_vars() 11389 1726854856.18902: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:54:16 -0400 (0:00:00.091) 0:00:08.612 ****** 11389 1726854856.19094: entering _queue_task() for managed_node3/stat 11389 1726854856.19683: worker is 1 (out of 1 
available) 11389 1726854856.19701: exiting _queue_task() for managed_node3/stat 11389 1726854856.19712: done queuing things up, now waiting for results queue to drain 11389 1726854856.19713: waiting for pending results... 11389 1726854856.20035: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11389 1726854856.20200: in run() - task 0affcc66-ac2b-deb8-c119-00000000018a 11389 1726854856.20204: variable 'ansible_search_path' from source: unknown 11389 1726854856.20207: variable 'ansible_search_path' from source: unknown 11389 1726854856.20523: calling self._execute() 11389 1726854856.20527: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.20530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.20533: variable 'omit' from source: magic vars 11389 1726854856.20870: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.20873: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.21034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854856.21503: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854856.21507: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854856.21510: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854856.21512: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854856.21692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854856.21696: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854856.21699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854856.21701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854856.21743: variable '__network_is_ostree' from source: set_fact 11389 1726854856.21750: Evaluated conditional (not __network_is_ostree is defined): False 11389 1726854856.21753: when evaluation is False, skipping this task 11389 1726854856.21755: _execute() done 11389 1726854856.21758: dumping result to json 11389 1726854856.21760: done dumping result, returning 11389 1726854856.21769: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-deb8-c119-00000000018a] 11389 1726854856.21777: sending task result for task 0affcc66-ac2b-deb8-c119-00000000018a 11389 1726854856.21927: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000018a skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11389 1726854856.22084: no more pending results, returning what we have 11389 1726854856.22088: results queue empty 11389 1726854856.22089: checking for any_errors_fatal 11389 1726854856.22094: done checking for any_errors_fatal 11389 1726854856.22094: checking for max_fail_percentage 11389 1726854856.22096: done checking for max_fail_percentage 11389 1726854856.22097: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.22098: done checking to see if all 
hosts have failed 11389 1726854856.22099: getting the remaining hosts for this loop 11389 1726854856.22100: done getting the remaining hosts for this loop 11389 1726854856.22103: getting the next task for host managed_node3 11389 1726854856.22109: done getting next task for host managed_node3 11389 1726854856.22112: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11389 1726854856.22116: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854856.22129: getting variables 11389 1726854856.22130: in VariableManager get_vars() 11389 1726854856.22168: Calling all_inventory to load vars for managed_node3 11389 1726854856.22171: Calling groups_inventory to load vars for managed_node3 11389 1726854856.22173: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.22181: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.22184: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.22189: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.22607: WORKER PROCESS EXITING 11389 1726854856.22638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.22937: done with get_vars() 11389 1726854856.23172: done getting variables 11389 1726854856.23229: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:54:16 -0400 (0:00:00.042) 0:00:08.655 ****** 11389 1726854856.23263: entering _queue_task() for managed_node3/set_fact 11389 1726854856.23884: worker is 1 (out of 1 available) 11389 1726854856.24050: exiting _queue_task() for managed_node3/set_fact 11389 1726854856.24060: done queuing things up, now waiting for results queue to drain 11389 1726854856.24062: waiting for pending results... 
11389 1726854856.24370: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11389 1726854856.24896: in run() - task 0affcc66-ac2b-deb8-c119-00000000018b 11389 1726854856.24901: variable 'ansible_search_path' from source: unknown 11389 1726854856.24904: variable 'ansible_search_path' from source: unknown 11389 1726854856.24907: calling self._execute() 11389 1726854856.25092: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.25117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.25160: variable 'omit' from source: magic vars 11389 1726854856.25907: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.26091: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.26278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854856.26996: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854856.27033: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854856.27212: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854856.27223: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854856.27441: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854856.27474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854856.27538: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854856.27624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854856.27836: variable '__network_is_ostree' from source: set_fact 11389 1726854856.27849: Evaluated conditional (not __network_is_ostree is defined): False 11389 1726854856.27859: when evaluation is False, skipping this task 11389 1726854856.27871: _execute() done 11389 1726854856.27878: dumping result to json 11389 1726854856.27886: done dumping result, returning 11389 1726854856.28056: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-deb8-c119-00000000018b] 11389 1726854856.28059: sending task result for task 0affcc66-ac2b-deb8-c119-00000000018b 11389 1726854856.28130: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000018b 11389 1726854856.28133: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11389 1726854856.28211: no more pending results, returning what we have 11389 1726854856.28216: results queue empty 11389 1726854856.28216: checking for any_errors_fatal 11389 1726854856.28223: done checking for any_errors_fatal 11389 1726854856.28223: checking for max_fail_percentage 11389 1726854856.28225: done checking for max_fail_percentage 11389 1726854856.28226: checking to see if all hosts have failed and the running result is not ok 11389 1726854856.28228: done checking to see if all hosts have failed 11389 1726854856.28228: getting the remaining hosts for this loop 11389 1726854856.28230: done getting the remaining hosts for this loop 
11389 1726854856.28234: getting the next task for host managed_node3 11389 1726854856.28245: done getting next task for host managed_node3 11389 1726854856.28249: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11389 1726854856.28254: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854856.28271: getting variables 11389 1726854856.28273: in VariableManager get_vars() 11389 1726854856.28322: Calling all_inventory to load vars for managed_node3 11389 1726854856.28325: Calling groups_inventory to load vars for managed_node3 11389 1726854856.28328: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854856.28339: Calling all_plugins_play to load vars for managed_node3 11389 1726854856.28342: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854856.28345: Calling groups_plugins_play to load vars for managed_node3 11389 1726854856.29051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854856.29436: done with get_vars() 11389 1726854856.29447: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:54:16 -0400 (0:00:00.065) 0:00:08.721 ****** 11389 1726854856.29861: entering _queue_task() for managed_node3/service_facts 11389 1726854856.29862: Creating lock for service_facts 11389 1726854856.30666: worker is 1 (out of 1 available) 11389 1726854856.30775: exiting _queue_task() for managed_node3/service_facts 11389 1726854856.30791: done queuing things up, now waiting for results queue to drain 11389 1726854856.30793: waiting for pending results... 
11389 1726854856.31304: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11389 1726854856.31309: in run() - task 0affcc66-ac2b-deb8-c119-00000000018d 11389 1726854856.31312: variable 'ansible_search_path' from source: unknown 11389 1726854856.31315: variable 'ansible_search_path' from source: unknown 11389 1726854856.31318: calling self._execute() 11389 1726854856.31354: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.31358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.31368: variable 'omit' from source: magic vars 11389 1726854856.31734: variable 'ansible_distribution_major_version' from source: facts 11389 1726854856.31837: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854856.31841: variable 'omit' from source: magic vars 11389 1726854856.31843: variable 'omit' from source: magic vars 11389 1726854856.31876: variable 'omit' from source: magic vars 11389 1726854856.31907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854856.31941: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854856.31965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854856.31986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854856.31999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854856.32193: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854856.32197: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.32200: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11389 1726854856.32202: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854856.32204: Set connection var ansible_timeout to 10 11389 1726854856.32207: Set connection var ansible_connection to ssh 11389 1726854856.32209: Set connection var ansible_shell_type to sh 11389 1726854856.32211: Set connection var ansible_pipelining to False 11389 1726854856.32213: Set connection var ansible_shell_executable to /bin/sh 11389 1726854856.32215: variable 'ansible_shell_executable' from source: unknown 11389 1726854856.32217: variable 'ansible_connection' from source: unknown 11389 1726854856.32220: variable 'ansible_module_compression' from source: unknown 11389 1726854856.32222: variable 'ansible_shell_type' from source: unknown 11389 1726854856.32224: variable 'ansible_shell_executable' from source: unknown 11389 1726854856.32226: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854856.32228: variable 'ansible_pipelining' from source: unknown 11389 1726854856.32230: variable 'ansible_timeout' from source: unknown 11389 1726854856.32232: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854856.32458: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854856.32462: variable 'omit' from source: magic vars 11389 1726854856.32465: starting attempt loop 11389 1726854856.32467: running the handler 11389 1726854856.32469: _low_level_execute_command(): starting 11389 1726854856.32471: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854856.33297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854856.33301: stderr chunk (state=3): >>>debug2: match found <<< 11389 1726854856.33378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854856.33501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854856.33618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854856.35393: stdout chunk (state=3): >>>/root <<< 11389 1726854856.35451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854856.35457: stdout chunk (state=3): >>><<< 11389 1726854856.35467: stderr chunk (state=3): >>><<< 11389 1726854856.35494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854856.35504: _low_level_execute_command(): starting 11389 1726854856.35510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751 `" && echo ansible-tmp-1726854856.3549113-11895-245684362946751="` echo /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751 `" ) && sleep 0' 11389 1726854856.37126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854856.37152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854856.37240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854856.39174: stdout chunk (state=3): >>>ansible-tmp-1726854856.3549113-11895-245684362946751=/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751 <<< 11389 1726854856.39276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854856.39317: stderr chunk (state=3): >>><<< 11389 1726854856.39325: stdout chunk (state=3): >>><<< 11389 1726854856.39348: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854856.3549113-11895-245684362946751=/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854856.39715: variable 'ansible_module_compression' from source: unknown 11389 1726854856.39718: ANSIBALLZ: Using lock for service_facts 11389 1726854856.39720: ANSIBALLZ: Acquiring lock 11389 1726854856.39722: ANSIBALLZ: Lock acquired: 140464421665680 11389 1726854856.39724: ANSIBALLZ: Creating module 11389 1726854856.54842: ANSIBALLZ: Writing module into payload 11389 1726854856.55023: ANSIBALLZ: Writing module 11389 1726854856.55069: ANSIBALLZ: Renaming module 11389 1726854856.55083: ANSIBALLZ: Done creating module 11389 1726854856.55150: variable 'ansible_facts' from source: unknown 11389 1726854856.55319: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py 11389 1726854856.55726: Sending initial data 11389 1726854856.55735: Sent initial data (162 bytes) 11389 1726854856.57104: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854856.57164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854856.57218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854856.57255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854856.57355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854856.59036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854856.59129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854856.59200: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmprmxbj4al /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py <<< 11389 1726854856.59230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py" <<< 11389 1726854856.59265: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmprmxbj4al" to remote "/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py" <<< 11389 1726854856.60252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854856.60428: stderr chunk (state=3): >>><<< 11389 1726854856.60431: stdout chunk (state=3): >>><<< 11389 1726854856.60433: done transferring module to remote 11389 1726854856.60435: _low_level_execute_command(): starting 11389 1726854856.60438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/ /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py && sleep 0' 11389 1726854856.61377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854856.61393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854856.61411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854856.61433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854856.61626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854856.63430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854856.63501: stderr chunk (state=3): >>><<< 11389 1726854856.63509: stdout chunk (state=3): >>><<< 11389 1726854856.63528: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854856.63536: _low_level_execute_command(): starting 11389 1726854856.63545: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/AnsiballZ_service_facts.py && sleep 0' 11389 1726854856.64170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854856.64184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854856.64204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854856.64219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854856.64233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854856.64333: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854856.64367: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11389 1726854856.64477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.20271: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 11389 1726854858.20330: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", 
"status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11389 1726854858.21847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.21939: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. <<< 11389 1726854858.21942: stdout chunk (state=3): >>><<< 11389 1726854858.21944: stderr chunk (state=3): >>><<< 11389 1726854858.21965: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
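The `service_facts` result above is a flat name-to-entry mapping, where each entry carries `name`, `state`, `status`, and `source`. As a hedged sketch of how that structure can be consumed (the `services` dict below is a small hypothetical excerpt mirroring the log's entries, not the full result, and `services_in_state` is an illustrative helper, not part of the module):

```python
# Hypothetical excerpt of the "services" mapping seen in the log above;
# each value repeats the unit name plus its state/status/source fields.
services = {
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "dbus.service": {"name": "dbus.service", "state": "active",
                     "status": "alias", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
}

def services_in_state(services, state):
    """Return the sorted unit names whose 'state' field matches."""
    return sorted(n for n, s in services.items() if s["state"] == state)

print(services_in_state(services, "inactive"))  # -> ['firewalld.service']
```

In a playbook this same filtering would typically be done against `ansible_facts.services` with Jinja2 filters rather than Python, but the shape of the data is identical.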
11389 1726854858.24002: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854858.24007: _low_level_execute_command(): starting 11389 1726854858.24009: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854856.3549113-11895-245684362946751/ > /dev/null 2>&1 && sleep 0' 11389 1726854858.25151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854858.25314: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854858.25339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854858.25436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.27294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.27303: stdout chunk (state=3): >>><<< 11389 1726854858.27309: stderr chunk (state=3): >>><<< 11389 1726854858.27393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854858.27396: handler run complete 11389 1726854858.27446: variable 'ansible_facts' from source: unknown 11389 1726854858.27540: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854858.28081: variable 'ansible_facts' from source: unknown 11389 1726854858.28153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854858.28376: attempt loop complete, returning result 11389 1726854858.28415: _execute() done 11389 1726854858.28428: dumping result to json 11389 1726854858.28489: done dumping result, returning 11389 1726854858.28498: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-deb8-c119-00000000018d] 11389 1726854858.28504: sending task result for task 0affcc66-ac2b-deb8-c119-00000000018d ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854858.29108: no more pending results, returning what we have 11389 1726854858.29110: results queue empty 11389 1726854858.29111: checking for any_errors_fatal 11389 1726854858.29114: done checking for any_errors_fatal 11389 1726854858.29115: checking for max_fail_percentage 11389 1726854858.29116: done checking for max_fail_percentage 11389 1726854858.29117: checking to see if all hosts have failed and the running result is not ok 11389 1726854858.29118: done checking to see if all hosts have failed 11389 1726854858.29118: getting the remaining hosts for this loop 11389 1726854858.29119: done getting the remaining hosts for this loop 11389 1726854858.29122: getting the next task for host managed_node3 11389 1726854858.29126: done getting next task for host managed_node3 11389 1726854858.29129: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11389 1726854858.29133: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854858.29140: getting variables 11389 1726854858.29141: in VariableManager get_vars() 11389 1726854858.29168: Calling all_inventory to load vars for managed_node3 11389 1726854858.29171: Calling groups_inventory to load vars for managed_node3 11389 1726854858.29173: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854858.29180: Calling all_plugins_play to load vars for managed_node3 11389 1726854858.29183: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854858.29185: Calling groups_plugins_play to load vars for managed_node3 11389 1726854858.29561: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000018d 11389 1726854858.29565: WORKER PROCESS EXITING 11389 1726854858.29576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854858.29842: done with get_vars() 11389 1726854858.29851: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:54:18 -0400 (0:00:02.000) 0:00:10.722 ****** 11389 1726854858.29921: entering 
_queue_task() for managed_node3/package_facts 11389 1726854858.29922: Creating lock for package_facts 11389 1726854858.30132: worker is 1 (out of 1 available) 11389 1726854858.30145: exiting _queue_task() for managed_node3/package_facts 11389 1726854858.30156: done queuing things up, now waiting for results queue to drain 11389 1726854858.30157: waiting for pending results... 11389 1726854858.30320: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11389 1726854858.30417: in run() - task 0affcc66-ac2b-deb8-c119-00000000018e 11389 1726854858.30453: variable 'ansible_search_path' from source: unknown 11389 1726854858.30457: variable 'ansible_search_path' from source: unknown 11389 1726854858.30468: calling self._execute() 11389 1726854858.30686: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854858.30692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854858.30695: variable 'omit' from source: magic vars 11389 1726854858.30917: variable 'ansible_distribution_major_version' from source: facts 11389 1726854858.30928: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854858.30934: variable 'omit' from source: magic vars 11389 1726854858.31009: variable 'omit' from source: magic vars 11389 1726854858.31071: variable 'omit' from source: magic vars 11389 1726854858.31384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854858.31390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854858.31405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854858.31424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854858.31436: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854858.31494: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854858.31497: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854858.31500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854858.31848: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854858.31922: Set connection var ansible_timeout to 10 11389 1726854858.31926: Set connection var ansible_connection to ssh 11389 1726854858.31928: Set connection var ansible_shell_type to sh 11389 1726854858.31931: Set connection var ansible_pipelining to False 11389 1726854858.31933: Set connection var ansible_shell_executable to /bin/sh 11389 1726854858.31935: variable 'ansible_shell_executable' from source: unknown 11389 1726854858.31937: variable 'ansible_connection' from source: unknown 11389 1726854858.31939: variable 'ansible_module_compression' from source: unknown 11389 1726854858.31941: variable 'ansible_shell_type' from source: unknown 11389 1726854858.31944: variable 'ansible_shell_executable' from source: unknown 11389 1726854858.31945: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854858.31948: variable 'ansible_pipelining' from source: unknown 11389 1726854858.31950: variable 'ansible_timeout' from source: unknown 11389 1726854858.31952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854858.32359: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854858.32364: variable 'omit' from source: magic vars 11389 1726854858.32367: starting attempt loop 11389 
1726854858.32369: running the handler 11389 1726854858.32371: _low_level_execute_command(): starting 11389 1726854858.32373: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854858.33389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854858.33408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854858.33507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.35533: stdout chunk (state=3): >>>/root <<< 11389 1726854858.35537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.35539: stdout chunk (state=3): >>><<< 11389 1726854858.35541: stderr chunk (state=3): >>><<< 11389 1726854858.35544: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854858.35751: _low_level_execute_command(): starting 11389 1726854858.35756: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795 `" && echo ansible-tmp-1726854858.3561423-11980-97765175018795="` echo /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795 `" ) && sleep 0' 11389 1726854858.36664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854858.37000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854858.37035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854858.37277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.39119: stdout chunk (state=3): >>>ansible-tmp-1726854858.3561423-11980-97765175018795=/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795 <<< 11389 1726854858.39651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.39655: stdout chunk (state=3): >>><<< 11389 1726854858.39678: stderr chunk (state=3): >>><<< 11389 1726854858.39681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854858.3561423-11980-97765175018795=/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854858.39726: variable 'ansible_module_compression' from source: unknown 11389 1726854858.39788: ANSIBALLZ: Using lock for package_facts 11389 1726854858.39792: ANSIBALLZ: Acquiring lock 11389 1726854858.39794: ANSIBALLZ: Lock acquired: 140464425441360 11389 1726854858.39796: ANSIBALLZ: Creating module 11389 1726854858.84721: ANSIBALLZ: Writing module into payload 11389 1726854858.84726: ANSIBALLZ: Writing module 11389 1726854858.84751: ANSIBALLZ: Renaming module 11389 1726854858.84754: ANSIBALLZ: Done creating module 11389 1726854858.84784: variable 'ansible_facts' from source: unknown 11389 1726854858.84994: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py 11389 1726854858.85259: Sending initial data 11389 1726854858.85262: Sent initial data (161 bytes) 11389 1726854858.85791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854858.85915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
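The remote tmpdir command executed just above (`umask 77 && mkdir -p ... && echo ansible-tmp-...=...`) is a quoting-heavy one-liner, but the pattern itself is simple: create the directory with a restrictive umask so it comes out mode 0700, then print a `name=path` line for the controller to parse. A minimal standalone sketch of that pattern (the `ansible-tmp-demo` name and path are illustrative, not the log's actual tmpdir):

```shell
#!/bin/sh
# Sketch of Ansible's remote tmpdir creation: umask 77 makes mkdir create
# the directory mode 0700 (owner-only), and the echo reports the resolved
# path back as a name=path line on stdout for the controller to capture.
tmp="${TMPDIR:-/tmp}/ansible-tmp-demo.$$"
( umask 77 && mkdir -p "$tmp" ) && echo "ansible-tmp-demo=$tmp"
```

The subshell keeps the `umask` change scoped to the `mkdir`, which is why the real command wraps everything in `( ... )`.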
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854858.85931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854858.86030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.87795: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854858.87868: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854858.87952: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5ips9fio /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py <<< 11389 1726854858.87956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py" <<< 11389 1726854858.88025: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11389 1726854858.88029: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5ips9fio" to remote "/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py" <<< 11389 1726854858.89749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.89754: stdout chunk (state=3): >>><<< 11389 1726854858.89756: stderr chunk (state=3): >>><<< 11389 1726854858.89790: done transferring module to remote 11389 1726854858.89850: _low_level_execute_command(): starting 11389 1726854858.89854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/ /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py && sleep 0' 11389 1726854858.90490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854858.90513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854858.90618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854858.90648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854858.90677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854858.90695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854858.90792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854858.92581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854858.92615: stderr chunk (state=3): >>><<< 11389 1726854858.92619: stdout chunk (state=3): >>><<< 11389 1726854858.92634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854858.92637: _low_level_execute_command(): starting 11389 1726854858.92643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/AnsiballZ_package_facts.py && sleep 0' 11389 1726854858.93166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854858.93174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854858.93178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854858.93194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
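The `_low_level_execute_command()` calls above each open a session through the existing SSH ControlMaster socket (`auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f'`), first to `chmod u+x` the staged module, then to run it with the remote interpreter. As a rough illustration — not Ansible's actual implementation — the command reuse could be sketched like this, with the host, socket path, and tmp directory taken from the log:

```python
import subprocess

# Socket path as reported by the "auto-mux" line in the log above.
CONTROL_PATH = "/root/.ansible/cp/db1ec2560f"

def build_ssh_argv(host, remote_command, control_path=CONTROL_PATH):
    """Assemble an ssh argv that reuses a persistent control socket."""
    return [
        "ssh",
        "-o", "ControlMaster=auto",           # attach to the master if one exists
        "-o", f"ControlPath={control_path}",  # the mux socket seen in the log
        host,
        remote_command,
    ]

def run_over_ssh(host, remote_command):
    """Execute remotely; returns (rc, stdout, stderr) like the log's rc=0 lines."""
    proc = subprocess.run(build_ssh_argv(host, remote_command),
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# The two remote commands visible in the log: make the staged module
# executable, then run it with the remote Python.
tmp = "/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795"
chmod_cmd = f"/bin/sh -c 'chmod u+x {tmp}/ {tmp}/AnsiballZ_package_facts.py && sleep 0'"
exec_cmd = f"/bin/sh -c '/usr/bin/python3.12 {tmp}/AnsiballZ_package_facts.py && sleep 0'"
```

Because the control socket is already established, each of these sessions skips key exchange and authentication, which is why the log shows only `mux_client_request_session` rather than a full handshake.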
mux_client_hello_exchange: master version 4 <<< 11389 1726854858.93266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854859.37714: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11389 1726854859.37728: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11389 1726854859.37748: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11389 1726854859.37796: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11389 1726854859.37809: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", 
"version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": 
[{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": 
[{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": 
[{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": 
[{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", 
"version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11389 1726854859.39776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854859.39780: stdout chunk (state=3): >>><<< 11389 1726854859.39782: stderr chunk (state=3): >>><<< 11389 1726854859.39805: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854859.42452: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854859.42564: _low_level_execute_command(): starting 11389 1726854859.42568: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854858.3561423-11980-97765175018795/ > /dev/null 2>&1 && sleep 0' 11389 1726854859.43177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854859.43255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854859.43309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854859.43326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854859.43358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854859.43464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854859.45402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854859.45406: stdout chunk (state=3): >>><<< 11389 1726854859.45408: stderr chunk (state=3): >>><<< 11389 1726854859.45424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854859.45435: handler run complete 11389 
1726854859.46314: variable 'ansible_facts' from source: unknown 11389 1726854859.46778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.48714: variable 'ansible_facts' from source: unknown 11389 1726854859.49034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.49421: attempt loop complete, returning result 11389 1726854859.49432: _execute() done 11389 1726854859.49435: dumping result to json 11389 1726854859.49552: done dumping result, returning 11389 1726854859.49560: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-deb8-c119-00000000018e] 11389 1726854859.49564: sending task result for task 0affcc66-ac2b-deb8-c119-00000000018e 11389 1726854859.51059: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000018e 11389 1726854859.51062: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854859.51162: no more pending results, returning what we have 11389 1726854859.51164: results queue empty 11389 1726854859.51165: checking for any_errors_fatal 11389 1726854859.51170: done checking for any_errors_fatal 11389 1726854859.51170: checking for max_fail_percentage 11389 1726854859.51175: done checking for max_fail_percentage 11389 1726854859.51176: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.51176: done checking to see if all hosts have failed 11389 1726854859.51177: getting the remaining hosts for this loop 11389 1726854859.51178: done getting the remaining hosts for this loop 11389 1726854859.51183: getting the next task for host managed_node3 11389 1726854859.51192: done getting next task for host managed_node3 11389 1726854859.51196: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 11389 1726854859.51199: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854859.51207: getting variables 11389 1726854859.51208: in VariableManager get_vars() 11389 1726854859.51237: Calling all_inventory to load vars for managed_node3 11389 1726854859.51240: Calling groups_inventory to load vars for managed_node3 11389 1726854859.51242: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.51251: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.51254: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.51258: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.52392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.53285: done with get_vars() 11389 1726854859.53304: done getting variables 11389 1726854859.53346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 
Friday 20 September 2024 13:54:19 -0400 (0:00:01.234) 0:00:11.956 ****** 11389 1726854859.53371: entering _queue_task() for managed_node3/debug 11389 1726854859.53604: worker is 1 (out of 1 available) 11389 1726854859.53618: exiting _queue_task() for managed_node3/debug 11389 1726854859.53628: done queuing things up, now waiting for results queue to drain 11389 1726854859.53630: waiting for pending results... 11389 1726854859.53794: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11389 1726854859.53884: in run() - task 0affcc66-ac2b-deb8-c119-000000000027 11389 1726854859.53899: variable 'ansible_search_path' from source: unknown 11389 1726854859.53903: variable 'ansible_search_path' from source: unknown 11389 1726854859.53932: calling self._execute() 11389 1726854859.53999: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.54003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.54012: variable 'omit' from source: magic vars 11389 1726854859.54355: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.54358: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.54361: variable 'omit' from source: magic vars 11389 1726854859.54420: variable 'omit' from source: magic vars 11389 1726854859.54599: variable 'network_provider' from source: set_fact 11389 1726854859.54602: variable 'omit' from source: magic vars 11389 1726854859.54605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854859.54631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854859.54657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854859.54679: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854859.54700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854859.54739: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854859.54748: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.54755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.54929: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854859.54933: Set connection var ansible_timeout to 10 11389 1726854859.54936: Set connection var ansible_connection to ssh 11389 1726854859.54938: Set connection var ansible_shell_type to sh 11389 1726854859.54941: Set connection var ansible_pipelining to False 11389 1726854859.54943: Set connection var ansible_shell_executable to /bin/sh 11389 1726854859.54945: variable 'ansible_shell_executable' from source: unknown 11389 1726854859.54947: variable 'ansible_connection' from source: unknown 11389 1726854859.54950: variable 'ansible_module_compression' from source: unknown 11389 1726854859.54952: variable 'ansible_shell_type' from source: unknown 11389 1726854859.54954: variable 'ansible_shell_executable' from source: unknown 11389 1726854859.54956: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.54992: variable 'ansible_pipelining' from source: unknown 11389 1726854859.55006: variable 'ansible_timeout' from source: unknown 11389 1726854859.55037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.55193: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854859.55204: variable 'omit' from source: magic vars 11389 1726854859.55207: starting attempt loop 11389 1726854859.55210: running the handler 11389 1726854859.55254: handler run complete 11389 1726854859.55261: attempt loop complete, returning result 11389 1726854859.55264: _execute() done 11389 1726854859.55267: dumping result to json 11389 1726854859.55273: done dumping result, returning 11389 1726854859.55280: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-deb8-c119-000000000027] 11389 1726854859.55285: sending task result for task 0affcc66-ac2b-deb8-c119-000000000027 11389 1726854859.55373: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000027 11389 1726854859.55376: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11389 1726854859.55431: no more pending results, returning what we have 11389 1726854859.55434: results queue empty 11389 1726854859.55435: checking for any_errors_fatal 11389 1726854859.55443: done checking for any_errors_fatal 11389 1726854859.55444: checking for max_fail_percentage 11389 1726854859.55445: done checking for max_fail_percentage 11389 1726854859.55446: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.55447: done checking to see if all hosts have failed 11389 1726854859.55448: getting the remaining hosts for this loop 11389 1726854859.55449: done getting the remaining hosts for this loop 11389 1726854859.55453: getting the next task for host managed_node3 11389 1726854859.55460: done getting next task for host managed_node3 11389 1726854859.55476: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 11389 1726854859.55479: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854859.55491: getting variables 11389 1726854859.55493: in VariableManager get_vars() 11389 1726854859.55528: Calling all_inventory to load vars for managed_node3 11389 1726854859.55530: Calling groups_inventory to load vars for managed_node3 11389 1726854859.55532: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.55539: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.55542: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.55544: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.56349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.57211: done with get_vars() 11389 1726854859.57228: done getting variables 11389 1726854859.57295: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:54:19 -0400 (0:00:00.039) 0:00:11.996 ****** 11389 1726854859.57321: entering _queue_task() for managed_node3/fail 11389 1726854859.57322: Creating lock for fail 11389 1726854859.57559: worker is 1 (out of 1 available) 11389 1726854859.57574: exiting _queue_task() for managed_node3/fail 11389 1726854859.57585: done queuing things up, now waiting for results queue to drain 11389 1726854859.57586: waiting for pending results... 11389 1726854859.57751: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11389 1726854859.57837: in run() - task 0affcc66-ac2b-deb8-c119-000000000028 11389 1726854859.57848: variable 'ansible_search_path' from source: unknown 11389 1726854859.57852: variable 'ansible_search_path' from source: unknown 11389 1726854859.57884: calling self._execute() 11389 1726854859.57949: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.57953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.57962: variable 'omit' from source: magic vars 11389 1726854859.58229: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.58238: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.58325: variable 'network_state' from source: role '' defaults 11389 1726854859.58334: Evaluated conditional (network_state != {}): False 11389 1726854859.58337: when evaluation is False, skipping this task 11389 1726854859.58340: _execute() done 11389 1726854859.58342: dumping result to json 11389 1726854859.58345: done dumping result, returning 11389 1726854859.58354: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the 
network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-deb8-c119-000000000028] 11389 1726854859.58358: sending task result for task 0affcc66-ac2b-deb8-c119-000000000028 11389 1726854859.58443: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000028 11389 1726854859.58446: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854859.58522: no more pending results, returning what we have 11389 1726854859.58526: results queue empty 11389 1726854859.58526: checking for any_errors_fatal 11389 1726854859.58533: done checking for any_errors_fatal 11389 1726854859.58534: checking for max_fail_percentage 11389 1726854859.58536: done checking for max_fail_percentage 11389 1726854859.58537: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.58538: done checking to see if all hosts have failed 11389 1726854859.58539: getting the remaining hosts for this loop 11389 1726854859.58540: done getting the remaining hosts for this loop 11389 1726854859.58543: getting the next task for host managed_node3 11389 1726854859.58548: done getting next task for host managed_node3 11389 1726854859.58551: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11389 1726854859.58554: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11389 1726854859.58567: getting variables 11389 1726854859.58569: in VariableManager get_vars() 11389 1726854859.58610: Calling all_inventory to load vars for managed_node3 11389 1726854859.58613: Calling groups_inventory to load vars for managed_node3 11389 1726854859.58615: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.58624: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.58626: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.58629: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.59372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.60326: done with get_vars() 11389 1726854859.60342: done getting variables 11389 1726854859.60385: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:54:19 -0400 (0:00:00.030) 0:00:12.027 ****** 11389 1726854859.60410: entering _queue_task() for managed_node3/fail 11389 1726854859.60635: worker is 1 (out of 1 available) 11389 1726854859.60648: exiting _queue_task() for managed_node3/fail 11389 1726854859.60660: done queuing things up, now waiting for results queue to drain 11389 1726854859.60662: waiting for pending results... 
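Annotation: the two "Abort applying the network state configuration …" tasks above are skipped because the conditional `network_state != {}` evaluates to False (the role default for `network_state` is an empty dict). A hedged sketch of the shape of such a guarded task (illustrative only; the real tasks live in the collection's `roles/network/tasks/main.yml`):

```yaml
# Illustrative sketch of a conditionally-skipped fail task; message and
# second condition are assumptions, not copied from the collection source.
- name: Abort applying the network state configuration with initscripts
  ansible.builtin.fail:
    msg: Cannot use the network_state variable with the initscripts provider
  when:
    - network_state != {}            # evaluated False above, so the task skips
    - network_provider == "initscripts"
```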
11389 1726854859.60823: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11389 1726854859.60904: in run() - task 0affcc66-ac2b-deb8-c119-000000000029 11389 1726854859.60915: variable 'ansible_search_path' from source: unknown 11389 1726854859.60919: variable 'ansible_search_path' from source: unknown 11389 1726854859.60947: calling self._execute() 11389 1726854859.61011: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.61015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.61023: variable 'omit' from source: magic vars 11389 1726854859.61279: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.61289: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.61369: variable 'network_state' from source: role '' defaults 11389 1726854859.61379: Evaluated conditional (network_state != {}): False 11389 1726854859.61383: when evaluation is False, skipping this task 11389 1726854859.61386: _execute() done 11389 1726854859.61390: dumping result to json 11389 1726854859.61393: done dumping result, returning 11389 1726854859.61399: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-deb8-c119-000000000029] 11389 1726854859.61404: sending task result for task 0affcc66-ac2b-deb8-c119-000000000029 11389 1726854859.61486: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000029 11389 1726854859.61490: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854859.61572: no more pending results, returning what we have 11389 
1726854859.61574: results queue empty 11389 1726854859.61575: checking for any_errors_fatal 11389 1726854859.61580: done checking for any_errors_fatal 11389 1726854859.61581: checking for max_fail_percentage 11389 1726854859.61583: done checking for max_fail_percentage 11389 1726854859.61583: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.61584: done checking to see if all hosts have failed 11389 1726854859.61585: getting the remaining hosts for this loop 11389 1726854859.61586: done getting the remaining hosts for this loop 11389 1726854859.61591: getting the next task for host managed_node3 11389 1726854859.61596: done getting next task for host managed_node3 11389 1726854859.61600: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11389 1726854859.61603: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854859.61616: getting variables 11389 1726854859.61618: in VariableManager get_vars() 11389 1726854859.61648: Calling all_inventory to load vars for managed_node3 11389 1726854859.61650: Calling groups_inventory to load vars for managed_node3 11389 1726854859.61652: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.61660: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.61662: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.61663: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.62367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.63225: done with get_vars() 11389 1726854859.63242: done getting variables 11389 1726854859.63285: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:54:19 -0400 (0:00:00.028) 0:00:12.056 ****** 11389 1726854859.63310: entering _queue_task() for managed_node3/fail 11389 1726854859.63527: worker is 1 (out of 1 available) 11389 1726854859.63540: exiting _queue_task() for managed_node3/fail 11389 1726854859.63552: done queuing things up, now waiting for results queue to drain 11389 1726854859.63554: waiting for pending results... 
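Annotation: the next task ("Abort applying teaming configuration … EL10 or later") evaluates two further conditionals, visible below as `ansible_distribution_major_version | int > 9` and `ansible_distribution in __network_rh_distros`. A hedged sketch of a `when` clause shaped like those evaluations (the message is an assumption; `__network_rh_distros` is a role-internal variable from the collection):

```yaml
# Illustrative only: mirrors the conditionals the log evaluates next.
- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```

Note that each list item under `when` must be true for the task to run; a single False (as with `network_state != {}` above) short-circuits the task into a skip.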
11389 1726854859.63720: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11389 1726854859.63802: in run() - task 0affcc66-ac2b-deb8-c119-00000000002a 11389 1726854859.63814: variable 'ansible_search_path' from source: unknown 11389 1726854859.63818: variable 'ansible_search_path' from source: unknown 11389 1726854859.63845: calling self._execute() 11389 1726854859.63914: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.63918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.63929: variable 'omit' from source: magic vars 11389 1726854859.64185: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.64195: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.64318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854859.65782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854859.65826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854859.65852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854859.65884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854859.65905: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854859.65970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.65988: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.66006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.66032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.66042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.66110: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.66121: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11389 1726854859.66201: variable 'ansible_distribution' from source: facts 11389 1726854859.66204: variable '__network_rh_distros' from source: role '' defaults 11389 1726854859.66211: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11389 1726854859.66363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.66381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.66402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 
1726854859.66427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.66437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.66472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.66486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.66510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.66532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.66543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.66573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.66589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11389 1726854859.66606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.66633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.66643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.66836: variable 'network_connections' from source: task vars 11389 1726854859.66839: variable 'controller_profile' from source: play vars 11389 1726854859.66884: variable 'controller_profile' from source: play vars 11389 1726854859.66894: variable 'controller_device' from source: play vars 11389 1726854859.66937: variable 'controller_device' from source: play vars 11389 1726854859.66948: variable 'port1_profile' from source: play vars 11389 1726854859.66989: variable 'port1_profile' from source: play vars 11389 1726854859.66995: variable 'dhcp_interface1' from source: play vars 11389 1726854859.67036: variable 'dhcp_interface1' from source: play vars 11389 1726854859.67042: variable 'controller_profile' from source: play vars 11389 1726854859.67089: variable 'controller_profile' from source: play vars 11389 1726854859.67095: variable 'port2_profile' from source: play vars 11389 1726854859.67136: variable 'port2_profile' from source: play vars 11389 1726854859.67142: variable 'dhcp_interface2' from source: play vars 11389 1726854859.67186: variable 'dhcp_interface2' from source: play vars 11389 1726854859.67193: variable 'controller_profile' from source: play vars 11389 1726854859.67471: variable 'controller_profile' from source: play vars 11389 1726854859.67475: 
variable 'network_state' from source: role '' defaults 11389 1726854859.67524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854859.67632: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854859.67659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854859.67681: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854859.67705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854859.67735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854859.67750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854859.67766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.67786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854859.67817: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11389 1726854859.67821: when evaluation is False, skipping this task 11389 1726854859.67823: _execute() done 11389 1726854859.67825: dumping result to 
json 11389 1726854859.67827: done dumping result, returning 11389 1726854859.67838: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-deb8-c119-00000000002a] 11389 1726854859.67840: sending task result for task 0affcc66-ac2b-deb8-c119-00000000002a 11389 1726854859.67920: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002a 11389 1726854859.67923: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11389 1726854859.67982: no more pending results, returning what we have 11389 1726854859.67985: results queue empty 11389 1726854859.67986: checking for any_errors_fatal 11389 1726854859.67997: done checking for any_errors_fatal 11389 1726854859.67998: checking for max_fail_percentage 11389 1726854859.68000: done checking for max_fail_percentage 11389 1726854859.68001: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.68002: done checking to see if all hosts have failed 11389 1726854859.68003: getting the remaining hosts for this loop 11389 1726854859.68004: done getting the remaining hosts for this loop 11389 1726854859.68008: getting the next task for host managed_node3 11389 1726854859.68014: done getting next task for host managed_node3 11389 1726854859.68018: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11389 1726854859.68020: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854859.68034: getting variables 11389 1726854859.68035: in VariableManager get_vars() 11389 1726854859.68075: Calling all_inventory to load vars for managed_node3 11389 1726854859.68078: Calling groups_inventory to load vars for managed_node3 11389 1726854859.68080: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.68090: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.68092: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.68095: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.68974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.69821: done with get_vars() 11389 1726854859.69837: done getting variables 11389 1726854859.69908: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:54:19 -0400 (0:00:00.066) 0:00:12.122 ****** 
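The teaming-abort task above was skipped because its conditional evaluated to False: no connection or interface in `network_connections`/`network_state` has a `type` key matching `^team$`. A minimal Python sketch of that `selectattr("type", "defined") | selectattr("type", "match", "^team$")` filter chain (sample data is hypothetical; this run's actual profiles are a bond controller plus two ethernet ports, and Ansible's `match` test is emulated here with `re.match`):

```python
import re

def has_team_interfaces(network_connections, network_state):
    """Replicate the role's abort condition: any item whose 'type' key
    is defined (selectattr 'defined') and matches ^team$ (selectattr 'match')."""
    def team_items(items):
        return [i for i in items
                if "type" in i and re.match("^team$", i["type"])]
    return (len(team_items(network_connections)) > 0
            or len(team_items(network_state.get("interfaces", []))) > 0)

# Hypothetical topology mirroring this run: bond controller + two ports, no team type
connections = [
    {"name": "bond0",   "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
    {"name": "bond0.1", "type": "ethernet"},
]
print(has_team_interfaces(connections, {}))  # False -> task skipped, as logged
```

A False result here corresponds to the `"skip_reason": "Conditional result was False"` entry in the log.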
11389 1726854859.69930: entering _queue_task() for managed_node3/dnf 11389 1726854859.70163: worker is 1 (out of 1 available) 11389 1726854859.70178: exiting _queue_task() for managed_node3/dnf 11389 1726854859.70191: done queuing things up, now waiting for results queue to drain 11389 1726854859.70193: waiting for pending results... 11389 1726854859.70360: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11389 1726854859.70440: in run() - task 0affcc66-ac2b-deb8-c119-00000000002b 11389 1726854859.70452: variable 'ansible_search_path' from source: unknown 11389 1726854859.70455: variable 'ansible_search_path' from source: unknown 11389 1726854859.70483: calling self._execute() 11389 1726854859.70548: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.70551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.70559: variable 'omit' from source: magic vars 11389 1726854859.70815: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.70825: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.70959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854859.72915: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854859.72961: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854859.72989: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854859.73016: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854859.73036: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854859.73096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.73119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.73137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.73163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.73174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.73258: variable 'ansible_distribution' from source: facts 11389 1726854859.73262: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.73275: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11389 1726854859.73355: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854859.73442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.73457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.73475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.73502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.73512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.73541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.73558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.73575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.73601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.73611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.73637: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.73658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.73772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.73776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.73778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.73804: variable 'network_connections' from source: task vars 11389 1726854859.73813: variable 'controller_profile' from source: play vars 11389 1726854859.73857: variable 'controller_profile' from source: play vars 11389 1726854859.73866: variable 'controller_device' from source: play vars 11389 1726854859.73912: variable 'controller_device' from source: play vars 11389 1726854859.73923: variable 'port1_profile' from source: play vars 11389 1726854859.73962: variable 'port1_profile' from source: play vars 11389 1726854859.73971: variable 'dhcp_interface1' from source: play vars 11389 1726854859.74016: variable 'dhcp_interface1' from source: play vars 11389 1726854859.74022: variable 'controller_profile' from source: play vars 11389 1726854859.74063: variable 'controller_profile' from source: play vars 11389 1726854859.74071: variable 'port2_profile' from source: play vars 11389 
1726854859.74114: variable 'port2_profile' from source: play vars 11389 1726854859.74121: variable 'dhcp_interface2' from source: play vars 11389 1726854859.74161: variable 'dhcp_interface2' from source: play vars 11389 1726854859.74167: variable 'controller_profile' from source: play vars 11389 1726854859.74212: variable 'controller_profile' from source: play vars 11389 1726854859.74275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854859.74583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854859.74598: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854859.74628: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854859.74655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854859.74699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854859.74725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854859.74792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.74797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854859.74850: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854859.75065: variable 
'network_connections' from source: task vars 11389 1726854859.75075: variable 'controller_profile' from source: play vars 11389 1726854859.75135: variable 'controller_profile' from source: play vars 11389 1726854859.75146: variable 'controller_device' from source: play vars 11389 1726854859.75392: variable 'controller_device' from source: play vars 11389 1726854859.75395: variable 'port1_profile' from source: play vars 11389 1726854859.75397: variable 'port1_profile' from source: play vars 11389 1726854859.75399: variable 'dhcp_interface1' from source: play vars 11389 1726854859.75401: variable 'dhcp_interface1' from source: play vars 11389 1726854859.75403: variable 'controller_profile' from source: play vars 11389 1726854859.75405: variable 'controller_profile' from source: play vars 11389 1726854859.75407: variable 'port2_profile' from source: play vars 11389 1726854859.75462: variable 'port2_profile' from source: play vars 11389 1726854859.75474: variable 'dhcp_interface2' from source: play vars 11389 1726854859.75535: variable 'dhcp_interface2' from source: play vars 11389 1726854859.75546: variable 'controller_profile' from source: play vars 11389 1726854859.75606: variable 'controller_profile' from source: play vars 11389 1726854859.75641: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11389 1726854859.75648: when evaluation is False, skipping this task 11389 1726854859.75654: _execute() done 11389 1726854859.75660: dumping result to json 11389 1726854859.75667: done dumping result, returning 11389 1726854859.75679: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-00000000002b] 11389 1726854859.75689: sending task result for task 0affcc66-ac2b-deb8-c119-00000000002b skipping: [managed_node3] => { "changed": false, 
"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11389 1726854859.75833: no more pending results, returning what we have 11389 1726854859.75836: results queue empty 11389 1726854859.75837: checking for any_errors_fatal 11389 1726854859.75844: done checking for any_errors_fatal 11389 1726854859.75844: checking for max_fail_percentage 11389 1726854859.75846: done checking for max_fail_percentage 11389 1726854859.75847: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.75848: done checking to see if all hosts have failed 11389 1726854859.75849: getting the remaining hosts for this loop 11389 1726854859.75850: done getting the remaining hosts for this loop 11389 1726854859.75853: getting the next task for host managed_node3 11389 1726854859.75860: done getting next task for host managed_node3 11389 1726854859.75864: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11389 1726854859.75866: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854859.75882: getting variables 11389 1726854859.75883: in VariableManager get_vars() 11389 1726854859.75926: Calling all_inventory to load vars for managed_node3 11389 1726854859.75929: Calling groups_inventory to load vars for managed_node3 11389 1726854859.75931: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.75941: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.75943: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.75945: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.76510: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002b 11389 1726854859.76513: WORKER PROCESS EXITING 11389 1726854859.77423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.79089: done with get_vars() 11389 1726854859.79115: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11389 1726854859.79194: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:54:19 -0400 (0:00:00.092) 0:00:12.215 ****** 11389 1726854859.79225: entering _queue_task() for managed_node3/yum 11389 1726854859.79227: Creating lock for yum 11389 1726854859.79559: worker is 1 (out of 1 available) 11389 1726854859.79573: exiting _queue_task() for managed_node3/yum 11389 
1726854859.79588: done queuing things up, now waiting for results queue to drain 11389 1726854859.79590: waiting for pending results... 11389 1726854859.79854: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11389 1726854859.79985: in run() - task 0affcc66-ac2b-deb8-c119-00000000002c 11389 1726854859.80007: variable 'ansible_search_path' from source: unknown 11389 1726854859.80016: variable 'ansible_search_path' from source: unknown 11389 1726854859.80057: calling self._execute() 11389 1726854859.80150: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.80160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.80178: variable 'omit' from source: magic vars 11389 1726854859.80552: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.80577: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.80760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854859.83342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854859.83593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854859.83597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854859.83599: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854859.83601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854859.83610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.83643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.83674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.83728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.83747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.83850: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.83874: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11389 1726854859.83882: when evaluation is False, skipping this task 11389 1726854859.83891: _execute() done 11389 1726854859.83899: dumping result to json 11389 1726854859.83909: done dumping result, returning 11389 1726854859.83927: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-00000000002c] 11389 1726854859.83950: sending task result for task 0affcc66-ac2b-deb8-c119-00000000002c skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11389 1726854859.84205: no more pending results, returning 
what we have 11389 1726854859.84209: results queue empty 11389 1726854859.84210: checking for any_errors_fatal 11389 1726854859.84215: done checking for any_errors_fatal 11389 1726854859.84216: checking for max_fail_percentage 11389 1726854859.84218: done checking for max_fail_percentage 11389 1726854859.84219: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.84220: done checking to see if all hosts have failed 11389 1726854859.84221: getting the remaining hosts for this loop 11389 1726854859.84222: done getting the remaining hosts for this loop 11389 1726854859.84226: getting the next task for host managed_node3 11389 1726854859.84234: done getting next task for host managed_node3 11389 1726854859.84238: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11389 1726854859.84241: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854859.84391: getting variables 11389 1726854859.84394: in VariableManager get_vars() 11389 1726854859.84436: Calling all_inventory to load vars for managed_node3 11389 1726854859.84439: Calling groups_inventory to load vars for managed_node3 11389 1726854859.84441: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.84450: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002c 11389 1726854859.84453: WORKER PROCESS EXITING 11389 1726854859.84463: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.84466: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.84472: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.86048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.89324: done with get_vars() 11389 1726854859.89358: done getting variables 11389 1726854859.89590: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:54:19 -0400 (0:00:00.103) 0:00:12.319 ****** 11389 1726854859.89629: entering _queue_task() for managed_node3/fail 11389 1726854859.90458: worker is 1 (out of 1 available) 11389 1726854859.90473: exiting _queue_task() for managed_node3/fail 11389 1726854859.90485: done queuing things up, now waiting for results queue to drain 11389 1726854859.90488: waiting for pending results... 
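The skips recorded above come from Ansible's `when:` handling: each clause is compiled as a Jinja2 boolean expression and the task is skipped when it renders false (the log's "Evaluated conditional (...): False" lines). A minimal, hedged reproduction of the `ansible_distribution_major_version | int < 8` check using plain Jinja2 — the version value here is an illustrative stand-in, not taken from this run:

```python
from jinja2 import Environment

# Compile the same expression Ansible evaluated in the log above.
env = Environment()
expr = env.compile_expression("ansible_distribution_major_version | int < 8")

# Facts deliver the version as a string; the `int` filter coerces it.
print(expr(ansible_distribution_major_version="40"))  # False -> task skipped
print(expr(ansible_distribution_major_version="6"))   # True  -> task would run
```

This is a sketch of the evaluation semantics only; Ansible additionally wraps the clause in `{% if ... %}` templating with its own filters and tests available.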
11389 1726854859.90804: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11389 1726854859.91125: in run() - task 0affcc66-ac2b-deb8-c119-00000000002d 11389 1726854859.91140: variable 'ansible_search_path' from source: unknown 11389 1726854859.91143: variable 'ansible_search_path' from source: unknown 11389 1726854859.91182: calling self._execute() 11389 1726854859.91456: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854859.91475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854859.91540: variable 'omit' from source: magic vars 11389 1726854859.92122: variable 'ansible_distribution_major_version' from source: facts 11389 1726854859.92139: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854859.92279: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854859.92523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854859.95027: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854859.95076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854859.95104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854859.95130: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854859.95151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854859.95212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11389 1726854859.95234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.95253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.95283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.95295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.95327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.95344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.95362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.95394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.95404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.95431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854859.95447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854859.95466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.95494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854859.95505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854859.95621: variable 'network_connections' from source: task vars 11389 1726854859.95630: variable 'controller_profile' from source: play vars 11389 1726854859.95681: variable 'controller_profile' from source: play vars 11389 1726854859.95691: variable 'controller_device' from source: play vars 11389 1726854859.95735: variable 'controller_device' from source: play vars 11389 1726854859.95743: variable 'port1_profile' from source: play vars 11389 1726854859.95785: variable 'port1_profile' from source: play vars 11389 1726854859.95798: variable 'dhcp_interface1' from source: play vars 11389 1726854859.95842: variable 'dhcp_interface1' from source: play vars 11389 1726854859.95848: variable 'controller_profile' from source: play vars 11389 
1726854859.95893: variable 'controller_profile' from source: play vars 11389 1726854859.95904: variable 'port2_profile' from source: play vars 11389 1726854859.95942: variable 'port2_profile' from source: play vars 11389 1726854859.95949: variable 'dhcp_interface2' from source: play vars 11389 1726854859.95992: variable 'dhcp_interface2' from source: play vars 11389 1726854859.95997: variable 'controller_profile' from source: play vars 11389 1726854859.96041: variable 'controller_profile' from source: play vars 11389 1726854859.96089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854859.96391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854859.96394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854859.96396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854859.96398: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854859.96400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854859.96403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854859.96430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854859.96461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11389 1726854859.96538: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854859.96765: variable 'network_connections' from source: task vars 11389 1726854859.96775: variable 'controller_profile' from source: play vars 11389 1726854859.96838: variable 'controller_profile' from source: play vars 11389 1726854859.96848: variable 'controller_device' from source: play vars 11389 1726854859.96911: variable 'controller_device' from source: play vars 11389 1726854859.96924: variable 'port1_profile' from source: play vars 11389 1726854859.96985: variable 'port1_profile' from source: play vars 11389 1726854859.97000: variable 'dhcp_interface1' from source: play vars 11389 1726854859.97061: variable 'dhcp_interface1' from source: play vars 11389 1726854859.97073: variable 'controller_profile' from source: play vars 11389 1726854859.97139: variable 'controller_profile' from source: play vars 11389 1726854859.97155: variable 'port2_profile' from source: play vars 11389 1726854859.97216: variable 'port2_profile' from source: play vars 11389 1726854859.97229: variable 'dhcp_interface2' from source: play vars 11389 1726854859.97297: variable 'dhcp_interface2' from source: play vars 11389 1726854859.97302: variable 'controller_profile' from source: play vars 11389 1726854859.97362: variable 'controller_profile' from source: play vars 11389 1726854859.97395: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11389 1726854859.97398: when evaluation is False, skipping this task 11389 1726854859.97401: _execute() done 11389 1726854859.97403: dumping result to json 11389 1726854859.97405: done dumping result, returning 11389 1726854859.97413: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-00000000002d] 11389 1726854859.97418: sending 
task result for task 0affcc66-ac2b-deb8-c119-00000000002d skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11389 1726854859.97548: no more pending results, returning what we have 11389 1726854859.97552: results queue empty 11389 1726854859.97553: checking for any_errors_fatal 11389 1726854859.97557: done checking for any_errors_fatal 11389 1726854859.97557: checking for max_fail_percentage 11389 1726854859.97559: done checking for max_fail_percentage 11389 1726854859.97560: checking to see if all hosts have failed and the running result is not ok 11389 1726854859.97561: done checking to see if all hosts have failed 11389 1726854859.97562: getting the remaining hosts for this loop 11389 1726854859.97564: done getting the remaining hosts for this loop 11389 1726854859.97567: getting the next task for host managed_node3 11389 1726854859.97573: done getting next task for host managed_node3 11389 1726854859.97577: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11389 1726854859.97579: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854859.97595: getting variables 11389 1726854859.97596: in VariableManager get_vars() 11389 1726854859.97634: Calling all_inventory to load vars for managed_node3 11389 1726854859.97637: Calling groups_inventory to load vars for managed_node3 11389 1726854859.97639: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854859.97648: Calling all_plugins_play to load vars for managed_node3 11389 1726854859.97651: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854859.97653: Calling groups_plugins_play to load vars for managed_node3 11389 1726854859.98454: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002d 11389 1726854859.98458: WORKER PROCESS EXITING 11389 1726854859.98470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854859.99517: done with get_vars() 11389 1726854859.99538: done getting variables 11389 1726854859.99597: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:54:19 -0400 (0:00:00.099) 0:00:12.419 ****** 11389 1726854859.99630: entering _queue_task() for managed_node3/package 11389 1726854859.99918: worker is 1 (out of 1 available) 11389 1726854859.99929: exiting _queue_task() for managed_node3/package 11389 1726854859.99942: done queuing things up, now waiting for results queue to drain 11389 1726854859.99943: waiting for pending results... 
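The "Install packages" task queued here is later skipped by the conditional `not network_packages is subset(ansible_facts.packages.keys())`. The `subset` test (loaded above from `plugins/test/mathstuff.py`) is true when every element of the left-hand list appears in the right-hand collection, so the task only runs when some required package is missing. A plain-Python equivalent, with hypothetical package names:

```python
def is_subset(needed, installed):
    """True if every needed package is already present among installed ones."""
    return set(needed) <= set(installed)

network_packages = ["NetworkManager"]              # illustrative role default
packages_facts = {"NetworkManager": [], "kernel": []}  # shape of ansible_facts.packages

# when: not network_packages is subset(ansible_facts.packages.keys())
run_task = not is_subset(network_packages, packages_facts.keys())
print(run_task)  # False -> "Conditional result was False", task skipped
```

Under this reading, the skip in the log simply means every package in `network_packages` was already reported installed by the package-facts gathering earlier in the play.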
11389 1726854860.00401: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11389 1726854860.00406: in run() - task 0affcc66-ac2b-deb8-c119-00000000002e 11389 1726854860.00409: variable 'ansible_search_path' from source: unknown 11389 1726854860.00411: variable 'ansible_search_path' from source: unknown 11389 1726854860.00414: calling self._execute() 11389 1726854860.00481: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.00496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.00512: variable 'omit' from source: magic vars 11389 1726854860.00860: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.00872: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854860.01010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854860.01200: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854860.01231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854860.01254: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854860.01280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854860.01355: variable 'network_packages' from source: role '' defaults 11389 1726854860.01432: variable '__network_provider_setup' from source: role '' defaults 11389 1726854860.01440: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854860.01490: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854860.01495: variable '__network_packages_default_nm' from source: role '' defaults 11389 1726854860.01539: variable 
'__network_packages_default_nm' from source: role '' defaults 11389 1726854860.01650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854860.03245: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854860.03393: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854860.03396: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854860.03399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854860.03401: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854860.03464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.03496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.03526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.03570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.03591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 
1726854860.03640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.03667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.03700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.03744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.03764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.03986: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11389 1726854860.04101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.04128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.04155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.04200: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.04220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.04312: variable 'ansible_python' from source: facts 11389 1726854860.04418: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11389 1726854860.04428: variable '__network_wpa_supplicant_required' from source: role '' defaults 11389 1726854860.04500: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11389 1726854860.04583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.04607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.04623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.04652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.04663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.04697: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.04716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.04732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.04760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.04773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.04866: variable 'network_connections' from source: task vars 11389 1726854860.04872: variable 'controller_profile' from source: play vars 11389 1726854860.04939: variable 'controller_profile' from source: play vars 11389 1726854860.04948: variable 'controller_device' from source: play vars 11389 1726854860.05019: variable 'controller_device' from source: play vars 11389 1726854860.05029: variable 'port1_profile' from source: play vars 11389 1726854860.05100: variable 'port1_profile' from source: play vars 11389 1726854860.05108: variable 'dhcp_interface1' from source: play vars 11389 1726854860.05181: variable 'dhcp_interface1' from source: play vars 11389 1726854860.05184: variable 'controller_profile' from source: play vars 11389 1726854860.05250: variable 'controller_profile' from source: play vars 11389 1726854860.05257: variable 'port2_profile' from source: play vars 11389 
1726854860.05329: variable 'port2_profile' from source: play vars 11389 1726854860.05337: variable 'dhcp_interface2' from source: play vars 11389 1726854860.05410: variable 'dhcp_interface2' from source: play vars 11389 1726854860.05413: variable 'controller_profile' from source: play vars 11389 1726854860.05480: variable 'controller_profile' from source: play vars 11389 1726854860.05537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854860.05556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854860.05577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.05600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854860.05639: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854860.05817: variable 'network_connections' from source: task vars 11389 1726854860.05821: variable 'controller_profile' from source: play vars 11389 1726854860.05893: variable 'controller_profile' from source: play vars 11389 1726854860.05901: variable 'controller_device' from source: play vars 11389 1726854860.05972: variable 'controller_device' from source: play vars 11389 1726854860.05979: variable 'port1_profile' from source: play vars 11389 1726854860.06050: variable 'port1_profile' from source: play vars 11389 1726854860.06054: variable 'dhcp_interface1' from source: play vars 11389 1726854860.06123: variable 'dhcp_interface1' from source: 
play vars 11389 1726854860.06130: variable 'controller_profile' from source: play vars 11389 1726854860.06201: variable 'controller_profile' from source: play vars 11389 1726854860.06215: variable 'port2_profile' from source: play vars 11389 1726854860.06392: variable 'port2_profile' from source: play vars 11389 1726854860.06395: variable 'dhcp_interface2' from source: play vars 11389 1726854860.06413: variable 'dhcp_interface2' from source: play vars 11389 1726854860.06426: variable 'controller_profile' from source: play vars 11389 1726854860.06522: variable 'controller_profile' from source: play vars 11389 1726854860.06582: variable '__network_packages_default_wireless' from source: role '' defaults 11389 1726854860.06671: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854860.07019: variable 'network_connections' from source: task vars 11389 1726854860.07032: variable 'controller_profile' from source: play vars 11389 1726854860.07098: variable 'controller_profile' from source: play vars 11389 1726854860.07111: variable 'controller_device' from source: play vars 11389 1726854860.07193: variable 'controller_device' from source: play vars 11389 1726854860.07196: variable 'port1_profile' from source: play vars 11389 1726854860.07260: variable 'port1_profile' from source: play vars 11389 1726854860.07292: variable 'dhcp_interface1' from source: play vars 11389 1726854860.07354: variable 'dhcp_interface1' from source: play vars 11389 1726854860.07357: variable 'controller_profile' from source: play vars 11389 1726854860.07407: variable 'controller_profile' from source: play vars 11389 1726854860.07412: variable 'port2_profile' from source: play vars 11389 1726854860.07462: variable 'port2_profile' from source: play vars 11389 1726854860.07469: variable 'dhcp_interface2' from source: play vars 11389 1726854860.07517: variable 'dhcp_interface2' from source: play vars 11389 1726854860.07522: variable 'controller_profile' from 
source: play vars 11389 1726854860.07568: variable 'controller_profile' from source: play vars 11389 1726854860.07602: variable '__network_packages_default_team' from source: role '' defaults 11389 1726854860.07654: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854860.07851: variable 'network_connections' from source: task vars 11389 1726854860.07856: variable 'controller_profile' from source: play vars 11389 1726854860.07906: variable 'controller_profile' from source: play vars 11389 1726854860.07912: variable 'controller_device' from source: play vars 11389 1726854860.07956: variable 'controller_device' from source: play vars 11389 1726854860.07963: variable 'port1_profile' from source: play vars 11389 1726854860.08012: variable 'port1_profile' from source: play vars 11389 1726854860.08018: variable 'dhcp_interface1' from source: play vars 11389 1726854860.08062: variable 'dhcp_interface1' from source: play vars 11389 1726854860.08070: variable 'controller_profile' from source: play vars 11389 1726854860.08116: variable 'controller_profile' from source: play vars 11389 1726854860.08119: variable 'port2_profile' from source: play vars 11389 1726854860.08164: variable 'port2_profile' from source: play vars 11389 1726854860.08172: variable 'dhcp_interface2' from source: play vars 11389 1726854860.08217: variable 'dhcp_interface2' from source: play vars 11389 1726854860.08220: variable 'controller_profile' from source: play vars 11389 1726854860.08267: variable 'controller_profile' from source: play vars 11389 1726854860.08315: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854860.08358: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854860.08364: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 1726854860.08407: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 
1726854860.08542: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11389 1726854860.08834: variable 'network_connections' from source: task vars 11389 1726854860.08838: variable 'controller_profile' from source: play vars 11389 1726854860.08885: variable 'controller_profile' from source: play vars 11389 1726854860.08895: variable 'controller_device' from source: play vars 11389 1726854860.08931: variable 'controller_device' from source: play vars 11389 1726854860.08938: variable 'port1_profile' from source: play vars 11389 1726854860.08979: variable 'port1_profile' from source: play vars 11389 1726854860.08986: variable 'dhcp_interface1' from source: play vars 11389 1726854860.09029: variable 'dhcp_interface1' from source: play vars 11389 1726854860.09035: variable 'controller_profile' from source: play vars 11389 1726854860.09077: variable 'controller_profile' from source: play vars 11389 1726854860.09082: variable 'port2_profile' from source: play vars 11389 1726854860.09127: variable 'port2_profile' from source: play vars 11389 1726854860.09133: variable 'dhcp_interface2' from source: play vars 11389 1726854860.09173: variable 'dhcp_interface2' from source: play vars 11389 1726854860.09183: variable 'controller_profile' from source: play vars 11389 1726854860.09228: variable 'controller_profile' from source: play vars 11389 1726854860.09235: variable 'ansible_distribution' from source: facts 11389 1726854860.09238: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.09244: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.09263: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11389 1726854860.09372: variable 'ansible_distribution' from source: facts 11389 1726854860.09376: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.09378: variable 'ansible_distribution_major_version' from source: 
facts 11389 1726854860.09390: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11389 1726854860.09498: variable 'ansible_distribution' from source: facts 11389 1726854860.09501: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.09505: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.09533: variable 'network_provider' from source: set_fact 11389 1726854860.09546: variable 'ansible_facts' from source: unknown 11389 1726854860.09983: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11389 1726854860.09989: when evaluation is False, skipping this task 11389 1726854860.09992: _execute() done 11389 1726854860.09994: dumping result to json 11389 1726854860.09996: done dumping result, returning 11389 1726854860.10003: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-deb8-c119-00000000002e] 11389 1726854860.10008: sending task result for task 0affcc66-ac2b-deb8-c119-00000000002e 11389 1726854860.10097: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002e 11389 1726854860.10100: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11389 1726854860.10151: no more pending results, returning what we have 11389 1726854860.10155: results queue empty 11389 1726854860.10155: checking for any_errors_fatal 11389 1726854860.10162: done checking for any_errors_fatal 11389 1726854860.10163: checking for max_fail_percentage 11389 1726854860.10165: done checking for max_fail_percentage 11389 1726854860.10165: checking to see if all hosts have failed and the running result is not ok 11389 1726854860.10167: done checking to see if all hosts have failed 11389 1726854860.10170: getting the remaining hosts for 
this loop 11389 1726854860.10171: done getting the remaining hosts for this loop 11389 1726854860.10175: getting the next task for host managed_node3 11389 1726854860.10181: done getting next task for host managed_node3 11389 1726854860.10185: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11389 1726854860.10189: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854860.10203: getting variables 11389 1726854860.10205: in VariableManager get_vars() 11389 1726854860.10245: Calling all_inventory to load vars for managed_node3 11389 1726854860.10248: Calling groups_inventory to load vars for managed_node3 11389 1726854860.10250: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854860.10261: Calling all_plugins_play to load vars for managed_node3 11389 1726854860.10264: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854860.10266: Calling groups_plugins_play to load vars for managed_node3 11389 1726854860.11075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854860.11952: done with get_vars() 11389 1726854860.11976: done getting variables 11389 1726854860.12023: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:54:20 -0400 (0:00:00.124) 0:00:12.543 ****** 11389 1726854860.12049: entering _queue_task() for managed_node3/package 11389 1726854860.12306: worker is 1 (out of 1 available) 11389 1726854860.12320: exiting _queue_task() for managed_node3/package 11389 1726854860.12331: done queuing things up, now waiting for results queue to drain 11389 1726854860.12333: waiting for pending results... 11389 1726854860.12503: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11389 1726854860.12582: in run() - task 0affcc66-ac2b-deb8-c119-00000000002f 11389 1726854860.12595: variable 'ansible_search_path' from source: unknown 11389 1726854860.12599: variable 'ansible_search_path' from source: unknown 11389 1726854860.12629: calling self._execute() 11389 1726854860.12697: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.12701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.12709: variable 'omit' from source: magic vars 11389 1726854860.12966: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.12976: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854860.13060: variable 'network_state' from source: role '' defaults 11389 1726854860.13071: Evaluated conditional (network_state != {}): False 11389 1726854860.13074: when evaluation is False, skipping this task 11389 1726854860.13076: _execute() done 11389 
1726854860.13079: dumping result to json 11389 1726854860.13081: done dumping result, returning 11389 1726854860.13088: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-deb8-c119-00000000002f] 11389 1726854860.13094: sending task result for task 0affcc66-ac2b-deb8-c119-00000000002f 11389 1726854860.13180: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000002f 11389 1726854860.13183: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854860.13249: no more pending results, returning what we have 11389 1726854860.13252: results queue empty 11389 1726854860.13253: checking for any_errors_fatal 11389 1726854860.13256: done checking for any_errors_fatal 11389 1726854860.13257: checking for max_fail_percentage 11389 1726854860.13259: done checking for max_fail_percentage 11389 1726854860.13260: checking to see if all hosts have failed and the running result is not ok 11389 1726854860.13261: done checking to see if all hosts have failed 11389 1726854860.13262: getting the remaining hosts for this loop 11389 1726854860.13263: done getting the remaining hosts for this loop 11389 1726854860.13266: getting the next task for host managed_node3 11389 1726854860.13275: done getting next task for host managed_node3 11389 1726854860.13279: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11389 1726854860.13282: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854860.13297: getting variables 11389 1726854860.13299: in VariableManager get_vars() 11389 1726854860.13332: Calling all_inventory to load vars for managed_node3 11389 1726854860.13335: Calling groups_inventory to load vars for managed_node3 11389 1726854860.13337: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854860.13345: Calling all_plugins_play to load vars for managed_node3 11389 1726854860.13347: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854860.13349: Calling groups_plugins_play to load vars for managed_node3 11389 1726854860.14238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854860.18044: done with get_vars() 11389 1726854860.18067: done getting variables 11389 1726854860.18107: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:54:20 -0400 (0:00:00.060) 0:00:12.604 ****** 11389 1726854860.18128: entering _queue_task() for managed_node3/package 11389 1726854860.18383: worker is 1 (out of 1 available) 11389 1726854860.18399: exiting _queue_task() 
for managed_node3/package 11389 1726854860.18409: done queuing things up, now waiting for results queue to drain 11389 1726854860.18411: waiting for pending results... 11389 1726854860.18578: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11389 1726854860.18672: in run() - task 0affcc66-ac2b-deb8-c119-000000000030 11389 1726854860.18682: variable 'ansible_search_path' from source: unknown 11389 1726854860.18686: variable 'ansible_search_path' from source: unknown 11389 1726854860.18715: calling self._execute() 11389 1726854860.18781: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.18785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.18795: variable 'omit' from source: magic vars 11389 1726854860.19071: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.19083: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854860.19161: variable 'network_state' from source: role '' defaults 11389 1726854860.19172: Evaluated conditional (network_state != {}): False 11389 1726854860.19175: when evaluation is False, skipping this task 11389 1726854860.19177: _execute() done 11389 1726854860.19182: dumping result to json 11389 1726854860.19184: done dumping result, returning 11389 1726854860.19189: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-deb8-c119-000000000030] 11389 1726854860.19199: sending task result for task 0affcc66-ac2b-deb8-c119-000000000030 11389 1726854860.19285: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000030 11389 1726854860.19290: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 11389 1726854860.19346: no more pending results, returning what we have 11389 1726854860.19350: results queue empty 11389 1726854860.19351: checking for any_errors_fatal 11389 1726854860.19358: done checking for any_errors_fatal 11389 1726854860.19359: checking for max_fail_percentage 11389 1726854860.19361: done checking for max_fail_percentage 11389 1726854860.19362: checking to see if all hosts have failed and the running result is not ok 11389 1726854860.19363: done checking to see if all hosts have failed 11389 1726854860.19363: getting the remaining hosts for this loop 11389 1726854860.19364: done getting the remaining hosts for this loop 11389 1726854860.19370: getting the next task for host managed_node3 11389 1726854860.19378: done getting next task for host managed_node3 11389 1726854860.19383: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11389 1726854860.19385: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854860.19403: getting variables 11389 1726854860.19404: in VariableManager get_vars() 11389 1726854860.19439: Calling all_inventory to load vars for managed_node3 11389 1726854860.19441: Calling groups_inventory to load vars for managed_node3 11389 1726854860.19443: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854860.19451: Calling all_plugins_play to load vars for managed_node3 11389 1726854860.19453: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854860.19455: Calling groups_plugins_play to load vars for managed_node3 11389 1726854860.20203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854860.21076: done with get_vars() 11389 1726854860.21093: done getting variables 11389 1726854860.21166: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:54:20 -0400 (0:00:00.030) 0:00:12.634 ****** 11389 1726854860.21192: entering _queue_task() for managed_node3/service 11389 1726854860.21194: Creating lock for service 11389 1726854860.21428: worker is 1 (out of 1 available) 11389 1726854860.21441: exiting _queue_task() for managed_node3/service 11389 1726854860.21452: done queuing things up, now waiting for results queue to drain 11389 1726854860.21454: waiting for pending results... 
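The two package tasks above ("Install NetworkManager and nmstate…" and "Install python3-libnmstate…") were both skipped because the conditional `network_state != {}` evaluated to False: `network_state` came from the role's defaults, where it is an empty dict. As a minimal sketch (an illustration of the logic visible in this log, not the role's actual source), the decision reduces to:

```python
# Hedged sketch of the 'network_state != {}' conditional seen in the log above.
# The variable name mirrors the log; the function is hypothetical, for illustration.
network_state = {}  # role default ("from source: role '' defaults"), per the log


def should_install_nmstate(network_state: dict) -> bool:
    """Task runs only when the caller supplied a non-empty network_state."""
    return network_state != {}


assert should_install_nmstate({}) is False        # matches the skip in the log
assert should_install_nmstate({"interfaces": []}) is True  # a non-empty state would run it
```

Because the default is falsy-empty rather than undefined, the skip is a clean "Conditional result was False" rather than an undefined-variable error.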
11389 1726854860.21619: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11389 1726854860.21698: in run() - task 0affcc66-ac2b-deb8-c119-000000000031 11389 1726854860.21710: variable 'ansible_search_path' from source: unknown 11389 1726854860.21713: variable 'ansible_search_path' from source: unknown 11389 1726854860.21742: calling self._execute() 11389 1726854860.21814: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.21817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.21827: variable 'omit' from source: magic vars 11389 1726854860.22092: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.22103: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854860.22183: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854860.22315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854860.23824: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854860.24201: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854860.24206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854860.24241: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854860.24273: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854860.24354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11389 1726854860.24390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.24423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.24696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.24699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.24702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.24704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.24706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.24710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.24712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.24714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.24716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.24722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.24763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.24780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.24945: variable 'network_connections' from source: task vars 11389 1726854860.24964: variable 'controller_profile' from source: play vars 11389 1726854860.25039: variable 'controller_profile' from source: play vars 11389 1726854860.25053: variable 'controller_device' from source: play vars 11389 1726854860.25118: variable 'controller_device' from source: play vars 11389 1726854860.25134: variable 'port1_profile' from source: play vars 11389 1726854860.25197: variable 'port1_profile' from source: play vars 11389 1726854860.25203: variable 'dhcp_interface1' from source: play vars 11389 1726854860.25267: variable 'dhcp_interface1' from source: play vars 11389 1726854860.25273: variable 'controller_profile' from source: play vars 11389 
1726854860.25316: variable 'controller_profile' from source: play vars 11389 1726854860.25322: variable 'port2_profile' from source: play vars 11389 1726854860.25365: variable 'port2_profile' from source: play vars 11389 1726854860.25376: variable 'dhcp_interface2' from source: play vars 11389 1726854860.25423: variable 'dhcp_interface2' from source: play vars 11389 1726854860.25428: variable 'controller_profile' from source: play vars 11389 1726854860.25474: variable 'controller_profile' from source: play vars 11389 1726854860.25521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854860.25632: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854860.25659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854860.25697: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854860.25719: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854860.25749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854860.25764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854860.25792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.25806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11389 1726854860.25854: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854860.26007: variable 'network_connections' from source: task vars 11389 1726854860.26011: variable 'controller_profile' from source: play vars 11389 1726854860.26049: variable 'controller_profile' from source: play vars 11389 1726854860.26055: variable 'controller_device' from source: play vars 11389 1726854860.26098: variable 'controller_device' from source: play vars 11389 1726854860.26107: variable 'port1_profile' from source: play vars 11389 1726854860.26151: variable 'port1_profile' from source: play vars 11389 1726854860.26157: variable 'dhcp_interface1' from source: play vars 11389 1726854860.26200: variable 'dhcp_interface1' from source: play vars 11389 1726854860.26206: variable 'controller_profile' from source: play vars 11389 1726854860.26250: variable 'controller_profile' from source: play vars 11389 1726854860.26256: variable 'port2_profile' from source: play vars 11389 1726854860.26300: variable 'port2_profile' from source: play vars 11389 1726854860.26306: variable 'dhcp_interface2' from source: play vars 11389 1726854860.26349: variable 'dhcp_interface2' from source: play vars 11389 1726854860.26355: variable 'controller_profile' from source: play vars 11389 1726854860.26398: variable 'controller_profile' from source: play vars 11389 1726854860.26422: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11389 1726854860.26425: when evaluation is False, skipping this task 11389 1726854860.26428: _execute() done 11389 1726854860.26430: dumping result to json 11389 1726854860.26432: done dumping result, returning 11389 1726854860.26445: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-000000000031] 11389 1726854860.26448: sending task result for task 
0affcc66-ac2b-deb8-c119-000000000031 11389 1726854860.26529: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000031 11389 1726854860.26532: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11389 1726854860.26592: no more pending results, returning what we have 11389 1726854860.26595: results queue empty 11389 1726854860.26595: checking for any_errors_fatal 11389 1726854860.26601: done checking for any_errors_fatal 11389 1726854860.26601: checking for max_fail_percentage 11389 1726854860.26603: done checking for max_fail_percentage 11389 1726854860.26604: checking to see if all hosts have failed and the running result is not ok 11389 1726854860.26605: done checking to see if all hosts have failed 11389 1726854860.26605: getting the remaining hosts for this loop 11389 1726854860.26606: done getting the remaining hosts for this loop 11389 1726854860.26610: getting the next task for host managed_node3 11389 1726854860.26616: done getting next task for host managed_node3 11389 1726854860.26619: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11389 1726854860.26622: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854860.26635: getting variables 11389 1726854860.26636: in VariableManager get_vars() 11389 1726854860.26678: Calling all_inventory to load vars for managed_node3 11389 1726854860.26681: Calling groups_inventory to load vars for managed_node3 11389 1726854860.26684: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854860.26694: Calling all_plugins_play to load vars for managed_node3 11389 1726854860.26697: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854860.26699: Calling groups_plugins_play to load vars for managed_node3 11389 1726854860.27926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854860.29451: done with get_vars() 11389 1726854860.29474: done getting variables 11389 1726854860.29535: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:54:20 -0400 (0:00:00.083) 0:00:12.718 ****** 11389 1726854860.29566: entering _queue_task() for managed_node3/service 11389 1726854860.29851: worker is 1 (out of 1 available) 11389 1726854860.29863: exiting _queue_task() for managed_node3/service 11389 1726854860.29873: done queuing things up, now waiting for results queue to drain 11389 1726854860.29875: waiting for pending results... 
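The log above shows two different outcomes from the service-related conditionals: the restart task was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was truthy for this bond-only connection set, while the "Enable and start NetworkManager" task proceeds because `network_provider == "nm"`. A hedged sketch of both decisions (variable names mirror the log; the functions are hypothetical illustrations, not role source):

```python
# Illustrative reduction of the two conditionals evaluated in the log above.


def needs_nm_restart(wireless_defined: bool, team_defined: bool) -> bool:
    # "Restart NetworkManager due to wireless or team interfaces"
    return wireless_defined or team_defined


def should_enable_nm(network_provider: str, network_state: dict) -> bool:
    # "Enable and start NetworkManager"
    return network_provider == "nm" or network_state != {}


assert needs_nm_restart(False, False) is False  # skipped in the log: bond ports, no wifi/team
assert should_enable_nm("nm", {}) is True       # proceeds: provider was set_fact'd to "nm"
```

This explains why the same `network_state == {}` default that skipped the nmstate installs does not skip the service task: the provider check on the left of the `or` already satisfies it.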
11389 1726854860.30303: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11389 1726854860.30308: in run() - task 0affcc66-ac2b-deb8-c119-000000000032 11389 1726854860.30311: variable 'ansible_search_path' from source: unknown 11389 1726854860.30313: variable 'ansible_search_path' from source: unknown 11389 1726854860.30326: calling self._execute() 11389 1726854860.30411: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.30424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.30441: variable 'omit' from source: magic vars 11389 1726854860.30866: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.30870: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854860.30981: variable 'network_provider' from source: set_fact 11389 1726854860.30993: variable 'network_state' from source: role '' defaults 11389 1726854860.31007: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11389 1726854860.31019: variable 'omit' from source: magic vars 11389 1726854860.31075: variable 'omit' from source: magic vars 11389 1726854860.31117: variable 'network_service_name' from source: role '' defaults 11389 1726854860.31190: variable 'network_service_name' from source: role '' defaults 11389 1726854860.31297: variable '__network_provider_setup' from source: role '' defaults 11389 1726854860.31315: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854860.31379: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854860.31415: variable '__network_packages_default_nm' from source: role '' defaults 11389 1726854860.31454: variable '__network_packages_default_nm' from source: role '' defaults 11389 1726854860.31665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11389 1726854860.33808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854860.33845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854860.33890: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854860.33936: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854860.33964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854860.34050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.34088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.34134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.34171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.34242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.34245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11389 1726854860.34255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.34281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.34323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.34340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.34559: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11389 1726854860.34682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.34712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.34786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.34792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.34807: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.34899: variable 'ansible_python' from source: facts 11389 1726854860.34924: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11389 1726854860.35010: variable '__network_wpa_supplicant_required' from source: role '' defaults 11389 1726854860.35092: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11389 1726854860.35223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.35252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.35328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.35331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.35343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.35393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854860.35428: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854860.35464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.35508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854860.35526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854860.35893: variable 'network_connections' from source: task vars 11389 1726854860.35896: variable 'controller_profile' from source: play vars 11389 1726854860.35898: variable 'controller_profile' from source: play vars 11389 1726854860.35900: variable 'controller_device' from source: play vars 11389 1726854860.35902: variable 'controller_device' from source: play vars 11389 1726854860.35904: variable 'port1_profile' from source: play vars 11389 1726854860.35926: variable 'port1_profile' from source: play vars 11389 1726854860.35941: variable 'dhcp_interface1' from source: play vars 11389 1726854860.36019: variable 'dhcp_interface1' from source: play vars 11389 1726854860.36035: variable 'controller_profile' from source: play vars 11389 1726854860.36109: variable 'controller_profile' from source: play vars 11389 1726854860.36129: variable 'port2_profile' from source: play vars 11389 1726854860.36204: variable 'port2_profile' from source: play vars 11389 1726854860.36219: variable 'dhcp_interface2' from source: play vars 11389 1726854860.36300: variable 'dhcp_interface2' from source: play vars 11389 
1726854860.36315: variable 'controller_profile' from source: play vars 11389 1726854860.36393: variable 'controller_profile' from source: play vars 11389 1726854860.36503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854860.36704: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854860.36755: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854860.36807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854860.36855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854860.36933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854860.36966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854860.37007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854860.37045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854860.37103: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854860.37390: variable 'network_connections' from source: task vars 11389 1726854860.37403: variable 'controller_profile' from source: play vars 11389 1726854860.37484: variable 'controller_profile' from source: play vars 11389 
1726854860.37504: variable 'controller_device' from source: play vars 11389 1726854860.37581: variable 'controller_device' from source: play vars 11389 1726854860.37603: variable 'port1_profile' from source: play vars 11389 1726854860.37756: variable 'port1_profile' from source: play vars 11389 1726854860.37759: variable 'dhcp_interface1' from source: play vars 11389 1726854860.37761: variable 'dhcp_interface1' from source: play vars 11389 1726854860.37766: variable 'controller_profile' from source: play vars 11389 1726854860.37832: variable 'controller_profile' from source: play vars 11389 1726854860.37848: variable 'port2_profile' from source: play vars 11389 1726854860.37931: variable 'port2_profile' from source: play vars 11389 1726854860.37948: variable 'dhcp_interface2' from source: play vars 11389 1726854860.38033: variable 'dhcp_interface2' from source: play vars 11389 1726854860.38049: variable 'controller_profile' from source: play vars 11389 1726854860.38131: variable 'controller_profile' from source: play vars 11389 1726854860.38190: variable '__network_packages_default_wireless' from source: role '' defaults 11389 1726854860.38276: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854860.38584: variable 'network_connections' from source: task vars 11389 1726854860.38598: variable 'controller_profile' from source: play vars 11389 1726854860.38741: variable 'controller_profile' from source: play vars 11389 1726854860.38745: variable 'controller_device' from source: play vars 11389 1726854860.38761: variable 'controller_device' from source: play vars 11389 1726854860.38776: variable 'port1_profile' from source: play vars 11389 1726854860.38853: variable 'port1_profile' from source: play vars 11389 1726854860.38866: variable 'dhcp_interface1' from source: play vars 11389 1726854860.38943: variable 'dhcp_interface1' from source: play vars 11389 1726854860.38959: variable 'controller_profile' from source: play vars 
11389 1726854860.39033: variable 'controller_profile' from source: play vars 11389 1726854860.39046: variable 'port2_profile' from source: play vars 11389 1726854860.39124: variable 'port2_profile' from source: play vars 11389 1726854860.39136: variable 'dhcp_interface2' from source: play vars 11389 1726854860.39213: variable 'dhcp_interface2' from source: play vars 11389 1726854860.39285: variable 'controller_profile' from source: play vars 11389 1726854860.39301: variable 'controller_profile' from source: play vars 11389 1726854860.39331: variable '__network_packages_default_team' from source: role '' defaults 11389 1726854860.39416: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854860.39733: variable 'network_connections' from source: task vars 11389 1726854860.39744: variable 'controller_profile' from source: play vars 11389 1726854860.39816: variable 'controller_profile' from source: play vars 11389 1726854860.39833: variable 'controller_device' from source: play vars 11389 1726854860.39906: variable 'controller_device' from source: play vars 11389 1726854860.39919: variable 'port1_profile' from source: play vars 11389 1726854860.39995: variable 'port1_profile' from source: play vars 11389 1726854860.40008: variable 'dhcp_interface1' from source: play vars 11389 1726854860.40155: variable 'dhcp_interface1' from source: play vars 11389 1726854860.40159: variable 'controller_profile' from source: play vars 11389 1726854860.40166: variable 'controller_profile' from source: play vars 11389 1726854860.40178: variable 'port2_profile' from source: play vars 11389 1726854860.40248: variable 'port2_profile' from source: play vars 11389 1726854860.40266: variable 'dhcp_interface2' from source: play vars 11389 1726854860.40337: variable 'dhcp_interface2' from source: play vars 11389 1726854860.40348: variable 'controller_profile' from source: play vars 11389 1726854860.40422: variable 'controller_profile' from source: play vars 
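The long run of `variable '<name>' from source: <source>` entries above is ansible-core's VariableManager reporting, for each template lookup, which layer of the variable-precedence chain supplied the winning value (role defaults are overridden by facts, which are overridden by play vars, `set_fact`, and so on). The following standalone sketch illustrates that lookup order; it is a deliberately simplified illustration, not Ansible's real VariableManager, and the five-level `PRECEDENCE` list is a hypothetical subset of ansible-core's much longer precedence chain:

```python
# Hypothetical sketch of variable-precedence resolution, mirroring the
# "variable '<name>' from source: <source>" debug lines in the log above.
# Order runs lowest to highest precedence; this is a simplified subset of
# ansible-core's actual precedence list, for illustration only.
PRECEDENCE = ["role defaults", "facts", "play vars", "host vars", "set_fact"]

def resolve(name, sources):
    """Return (value, source) from the highest-precedence source defining name."""
    winner = None
    for level in PRECEDENCE:  # later (higher-precedence) levels override earlier ones
        if name in sources.get(level, {}):
            winner = (sources[level][name], level)
    if winner is None:
        raise KeyError(name)
    return winner

# Example inputs loosely modeled on the log: network_provider was set via
# set_fact, controller_profile comes from play vars, and the service name
# falls back to the role's defaults.
sources = {
    "role defaults": {"network_service_name": "NetworkManager.service"},
    "set_fact": {"network_provider": "nm"},
    "play vars": {"controller_profile": "bond0"},
}

resolve("network_provider", sources)      # -> ("nm", "set_fact")
resolve("network_service_name", sources)  # -> ("NetworkManager.service", "role defaults")
```

In the real engine the same mechanism explains why `network_provider` is reported `from source: set_fact` while `__network_service_name_default_nm` is reported `from source: role '' defaults`: each lookup simply surfaces the highest-precedence layer that defines the name.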
11389 1726854860.40495: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854860.40555: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854860.40566: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 1726854860.40634: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 1726854860.40911: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11389 1726854860.41316: variable 'network_connections' from source: task vars 11389 1726854860.41326: variable 'controller_profile' from source: play vars 11389 1726854860.41393: variable 'controller_profile' from source: play vars 11389 1726854860.41405: variable 'controller_device' from source: play vars 11389 1726854860.41469: variable 'controller_device' from source: play vars 11389 1726854860.41482: variable 'port1_profile' from source: play vars 11389 1726854860.41544: variable 'port1_profile' from source: play vars 11389 1726854860.41558: variable 'dhcp_interface1' from source: play vars 11389 1726854860.41624: variable 'dhcp_interface1' from source: play vars 11389 1726854860.41635: variable 'controller_profile' from source: play vars 11389 1726854860.41700: variable 'controller_profile' from source: play vars 11389 1726854860.41712: variable 'port2_profile' from source: play vars 11389 1726854860.41782: variable 'port2_profile' from source: play vars 11389 1726854860.41789: variable 'dhcp_interface2' from source: play vars 11389 1726854860.41891: variable 'dhcp_interface2' from source: play vars 11389 1726854860.41894: variable 'controller_profile' from source: play vars 11389 1726854860.41922: variable 'controller_profile' from source: play vars 11389 1726854860.41936: variable 'ansible_distribution' from source: facts 11389 1726854860.41944: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.41952: 
variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.41980: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11389 1726854860.42162: variable 'ansible_distribution' from source: facts 11389 1726854860.42171: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.42215: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.42218: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11389 1726854860.42372: variable 'ansible_distribution' from source: facts 11389 1726854860.42381: variable '__network_rh_distros' from source: role '' defaults 11389 1726854860.42393: variable 'ansible_distribution_major_version' from source: facts 11389 1726854860.42435: variable 'network_provider' from source: set_fact 11389 1726854860.42461: variable 'omit' from source: magic vars 11389 1726854860.42539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854860.42543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854860.42545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854860.42563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854860.42577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854860.42610: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854860.42618: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.42625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.42727: Set connection var ansible_module_compression to ZIP_DEFLATED 
11389 1726854860.42740: Set connection var ansible_timeout to 10 11389 1726854860.42746: Set connection var ansible_connection to ssh 11389 1726854860.42758: Set connection var ansible_shell_type to sh 11389 1726854860.42767: Set connection var ansible_pipelining to False 11389 1726854860.42992: Set connection var ansible_shell_executable to /bin/sh 11389 1726854860.42995: variable 'ansible_shell_executable' from source: unknown 11389 1726854860.42997: variable 'ansible_connection' from source: unknown 11389 1726854860.42999: variable 'ansible_module_compression' from source: unknown 11389 1726854860.43001: variable 'ansible_shell_type' from source: unknown 11389 1726854860.43003: variable 'ansible_shell_executable' from source: unknown 11389 1726854860.43005: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854860.43007: variable 'ansible_pipelining' from source: unknown 11389 1726854860.43009: variable 'ansible_timeout' from source: unknown 11389 1726854860.43011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854860.43013: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854860.43016: variable 'omit' from source: magic vars 11389 1726854860.43018: starting attempt loop 11389 1726854860.43020: running the handler 11389 1726854860.43037: variable 'ansible_facts' from source: unknown 11389 1726854860.43812: _low_level_execute_command(): starting 11389 1726854860.43825: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854860.44557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854860.44594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854860.44607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854860.44694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854860.46389: stdout chunk (state=3): >>>/root <<< 11389 1726854860.46600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854860.46629: stderr chunk (state=3): >>><<< 11389 1726854860.46643: stdout chunk (state=3): >>><<< 11389 1726854860.46674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854860.46703: _low_level_execute_command(): starting 11389 1726854860.46714: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531 `" && echo ansible-tmp-1726854860.4669158-12060-107492260815531="` echo /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531 `" ) && sleep 0' 11389 1726854860.47325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854860.47347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854860.47361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854860.47377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854860.47399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854860.47413: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854860.47454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854860.47526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854860.47542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854860.47568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854860.47658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854860.49597: stdout chunk (state=3): >>>ansible-tmp-1726854860.4669158-12060-107492260815531=/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531 <<< 11389 1726854860.49841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854860.49875: stdout chunk (state=3): >>><<< 11389 1726854860.49878: stderr chunk (state=3): >>><<< 11389 1726854860.49897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854860.4669158-12060-107492260815531=/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854860.50093: variable 'ansible_module_compression' from source: unknown 11389 1726854860.50098: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11389 1726854860.50101: ANSIBALLZ: Acquiring lock 11389 1726854860.50103: ANSIBALLZ: Lock acquired: 140464425326096 11389 1726854860.50105: ANSIBALLZ: Creating module 11389 1726854860.77139: ANSIBALLZ: Writing module into payload 11389 1726854860.77246: ANSIBALLZ: Writing module 11389 1726854860.77266: ANSIBALLZ: Renaming module 11389 1726854860.77273: ANSIBALLZ: Done creating module 11389 1726854860.77291: variable 'ansible_facts' from source: unknown 11389 1726854860.77397: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py 11389 1726854860.77495: Sending initial data 11389 1726854860.77498: Sent initial data (156 bytes) 11389 1726854860.77935: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854860.77939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854860.77942: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854860.77944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854860.77946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854860.77997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854860.78001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854860.78006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854860.78066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854860.79759: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854860.79863: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 11389 1726854860.79940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpsg6i2lg4 /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py <<< 11389 1726854860.79942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py" <<< 11389 1726854860.79998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpsg6i2lg4" to remote "/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py" <<< 11389 1726854860.81247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854860.81250: stdout chunk (state=3): >>><<< 11389 1726854860.81292: stderr chunk (state=3): >>><<< 11389 1726854860.81314: done transferring module to remote 11389 1726854860.81324: _low_level_execute_command(): starting 11389 1726854860.81335: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/ /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py && sleep 0' 11389 1726854860.82065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854860.82069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854860.82085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854860.82132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854860.82148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854860.82262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854860.84129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854860.84133: stdout chunk (state=3): >>><<< 11389 1726854860.84139: stderr chunk (state=3): >>><<< 11389 1726854860.84158: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854860.84161: _low_level_execute_command(): starting 11389 1726854860.84245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/AnsiballZ_systemd.py && sleep 0' 11389 1726854860.85216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854860.85220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854860.85223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854860.85225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854860.85302: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.15118: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10342400", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3328118784", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "583632000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11389 1726854861.15127: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", 
"DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.so<<< 11389 1726854861.15228: stdout chunk (state=3): >>>cket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", 
"AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11389 1726854861.17082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854861.17086: stdout chunk (state=3): >>><<< 11389 1726854861.17091: stderr chunk (state=3): >>><<< 11389 1726854861.17111: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10342400", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3328118784", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "583632000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854861.17400: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854861.17405: _low_level_execute_command(): starting 11389 1726854861.17407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854860.4669158-12060-107492260815531/ > /dev/null 2>&1 && sleep 0' 11389 1726854861.17933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854861.17946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854861.17965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854861.18007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854861.18020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854861.18174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.20052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854861.20098: stderr chunk (state=3): >>><<< 11389 1726854861.20127: stdout chunk (state=3): >>><<< 11389 1726854861.20130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854861.20149: handler run complete 11389 1726854861.20185: attempt loop complete, returning result 11389 1726854861.20189: _execute() done 11389 1726854861.20191: dumping result to json 11389 1726854861.20207: done dumping result, returning 11389 1726854861.20215: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-deb8-c119-000000000032] 11389 1726854861.20218: sending task result for task 0affcc66-ac2b-deb8-c119-000000000032 11389 1726854861.20436: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000032 11389 1726854861.20439: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854861.20485: no more pending results, returning what we have 11389 1726854861.20500: results queue empty 11389 1726854861.20501: checking for any_errors_fatal 11389 1726854861.20507: done checking for any_errors_fatal 11389 1726854861.20508: checking for max_fail_percentage 11389 1726854861.20510: done checking for max_fail_percentage 11389 1726854861.20510: checking to see if all hosts have failed and the running result is not ok 11389 1726854861.20511: done checking to see if all hosts have failed 11389 1726854861.20512: getting the remaining hosts for this loop 11389 1726854861.20513: done getting the remaining hosts for this loop 11389 1726854861.20517: getting the next task for host managed_node3 11389 1726854861.20522: done getting next task for host 
managed_node3 11389 1726854861.20526: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11389 1726854861.20528: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854861.20539: getting variables 11389 1726854861.20540: in VariableManager get_vars() 11389 1726854861.20577: Calling all_inventory to load vars for managed_node3 11389 1726854861.20579: Calling groups_inventory to load vars for managed_node3 11389 1726854861.20581: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854861.20593: Calling all_plugins_play to load vars for managed_node3 11389 1726854861.20595: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854861.20605: Calling groups_plugins_play to load vars for managed_node3 11389 1726854861.21386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854861.22345: done with get_vars() 11389 1726854861.22360: done getting variables 11389 1726854861.22406: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:54:21 -0400 (0:00:00.928) 0:00:13.647 ****** 11389 1726854861.22430: entering _queue_task() for managed_node3/service 11389 1726854861.22665: worker is 1 (out of 1 available) 11389 1726854861.22679: exiting _queue_task() for managed_node3/service 11389 1726854861.22692: done queuing things up, now waiting for results queue to drain 11389 1726854861.22694: waiting for pending results... 11389 1726854861.22861: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11389 1726854861.22952: in run() - task 0affcc66-ac2b-deb8-c119-000000000033 11389 1726854861.22963: variable 'ansible_search_path' from source: unknown 11389 1726854861.22966: variable 'ansible_search_path' from source: unknown 11389 1726854861.22998: calling self._execute() 11389 1726854861.23064: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.23070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.23079: variable 'omit' from source: magic vars 11389 1726854861.23347: variable 'ansible_distribution_major_version' from source: facts 11389 1726854861.23360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854861.23437: variable 'network_provider' from source: set_fact 11389 1726854861.23441: Evaluated conditional (network_provider == "nm"): True 11389 1726854861.23509: variable '__network_wpa_supplicant_required' from source: role '' defaults 11389 1726854861.23573: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11389 1726854861.23682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854861.25071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 
1726854861.25117: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854861.25144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854861.25172: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854861.25191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854861.25257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854861.25278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854861.25297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854861.25325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854861.25337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854861.25370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854861.25385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854861.25403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854861.25434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854861.25440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854861.25470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854861.25485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854861.25503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854861.25527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854861.25543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854861.25636: variable 'network_connections' from source: task vars 
11389 1726854861.25651: variable 'controller_profile' from source: play vars 11389 1726854861.25695: variable 'controller_profile' from source: play vars 11389 1726854861.25703: variable 'controller_device' from source: play vars 11389 1726854861.25745: variable 'controller_device' from source: play vars 11389 1726854861.25752: variable 'port1_profile' from source: play vars 11389 1726854861.25800: variable 'port1_profile' from source: play vars 11389 1726854861.25806: variable 'dhcp_interface1' from source: play vars 11389 1726854861.25848: variable 'dhcp_interface1' from source: play vars 11389 1726854861.25854: variable 'controller_profile' from source: play vars 11389 1726854861.25903: variable 'controller_profile' from source: play vars 11389 1726854861.25909: variable 'port2_profile' from source: play vars 11389 1726854861.25950: variable 'port2_profile' from source: play vars 11389 1726854861.25956: variable 'dhcp_interface2' from source: play vars 11389 1726854861.26001: variable 'dhcp_interface2' from source: play vars 11389 1726854861.26007: variable 'controller_profile' from source: play vars 11389 1726854861.26047: variable 'controller_profile' from source: play vars 11389 1726854861.26100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854861.26204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854861.26230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854861.26251: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854861.26274: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854861.26308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854861.26325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854861.26341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854861.26363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854861.26401: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854861.26563: variable 'network_connections' from source: task vars 11389 1726854861.26566: variable 'controller_profile' from source: play vars 11389 1726854861.26610: variable 'controller_profile' from source: play vars 11389 1726854861.26616: variable 'controller_device' from source: play vars 11389 1726854861.26660: variable 'controller_device' from source: play vars 11389 1726854861.26670: variable 'port1_profile' from source: play vars 11389 1726854861.26711: variable 'port1_profile' from source: play vars 11389 1726854861.26716: variable 'dhcp_interface1' from source: play vars 11389 1726854861.26759: variable 'dhcp_interface1' from source: play vars 11389 1726854861.26765: variable 'controller_profile' from source: play vars 11389 1726854861.26808: variable 'controller_profile' from source: play vars 11389 1726854861.26813: variable 'port2_profile' from source: play vars 11389 1726854861.26856: variable 'port2_profile' from source: play vars 11389 1726854861.26862: variable 'dhcp_interface2' from source: play vars 11389 1726854861.26905: variable 'dhcp_interface2' from source: play vars 11389 
1726854861.26910: variable 'controller_profile' from source: play vars 11389 1726854861.26952: variable 'controller_profile' from source: play vars 11389 1726854861.26983: Evaluated conditional (__network_wpa_supplicant_required): False 11389 1726854861.26986: when evaluation is False, skipping this task 11389 1726854861.26991: _execute() done 11389 1726854861.26993: dumping result to json 11389 1726854861.26996: done dumping result, returning 11389 1726854861.27002: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-deb8-c119-000000000033] 11389 1726854861.27007: sending task result for task 0affcc66-ac2b-deb8-c119-000000000033 11389 1726854861.27092: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000033 11389 1726854861.27094: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11389 1726854861.27140: no more pending results, returning what we have 11389 1726854861.27143: results queue empty 11389 1726854861.27144: checking for any_errors_fatal 11389 1726854861.27163: done checking for any_errors_fatal 11389 1726854861.27164: checking for max_fail_percentage 11389 1726854861.27165: done checking for max_fail_percentage 11389 1726854861.27166: checking to see if all hosts have failed and the running result is not ok 11389 1726854861.27167: done checking to see if all hosts have failed 11389 1726854861.27170: getting the remaining hosts for this loop 11389 1726854861.27171: done getting the remaining hosts for this loop 11389 1726854861.27175: getting the next task for host managed_node3 11389 1726854861.27181: done getting next task for host managed_node3 11389 1726854861.27185: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11389 1726854861.27189: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854861.27204: getting variables 11389 1726854861.27206: in VariableManager get_vars() 11389 1726854861.27243: Calling all_inventory to load vars for managed_node3 11389 1726854861.27245: Calling groups_inventory to load vars for managed_node3 11389 1726854861.27247: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854861.27256: Calling all_plugins_play to load vars for managed_node3 11389 1726854861.27258: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854861.27260: Calling groups_plugins_play to load vars for managed_node3 11389 1726854861.28032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854861.28903: done with get_vars() 11389 1726854861.28919: done getting variables 11389 1726854861.28959: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:54:21 -0400 (0:00:00.065) 0:00:13.712 ****** 11389 1726854861.28982: entering _queue_task() for managed_node3/service 
11389 1726854861.29192: worker is 1 (out of 1 available) 11389 1726854861.29205: exiting _queue_task() for managed_node3/service 11389 1726854861.29216: done queuing things up, now waiting for results queue to drain 11389 1726854861.29218: waiting for pending results... 11389 1726854861.29375: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11389 1726854861.29453: in run() - task 0affcc66-ac2b-deb8-c119-000000000034 11389 1726854861.29467: variable 'ansible_search_path' from source: unknown 11389 1726854861.29473: variable 'ansible_search_path' from source: unknown 11389 1726854861.29500: calling self._execute() 11389 1726854861.29570: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.29574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.29584: variable 'omit' from source: magic vars 11389 1726854861.29843: variable 'ansible_distribution_major_version' from source: facts 11389 1726854861.29852: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854861.29934: variable 'network_provider' from source: set_fact 11389 1726854861.29937: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854861.29941: when evaluation is False, skipping this task 11389 1726854861.29943: _execute() done 11389 1726854861.29946: dumping result to json 11389 1726854861.29951: done dumping result, returning 11389 1726854861.29957: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-deb8-c119-000000000034] 11389 1726854861.29962: sending task result for task 0affcc66-ac2b-deb8-c119-000000000034 11389 1726854861.30045: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000034 11389 1726854861.30047: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: 
true' was specified for this result", "changed": false } 11389 1726854861.30095: no more pending results, returning what we have 11389 1726854861.30099: results queue empty 11389 1726854861.30100: checking for any_errors_fatal 11389 1726854861.30108: done checking for any_errors_fatal 11389 1726854861.30109: checking for max_fail_percentage 11389 1726854861.30111: done checking for max_fail_percentage 11389 1726854861.30111: checking to see if all hosts have failed and the running result is not ok 11389 1726854861.30112: done checking to see if all hosts have failed 11389 1726854861.30113: getting the remaining hosts for this loop 11389 1726854861.30114: done getting the remaining hosts for this loop 11389 1726854861.30117: getting the next task for host managed_node3 11389 1726854861.30122: done getting next task for host managed_node3 11389 1726854861.30126: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11389 1726854861.30128: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854861.30140: getting variables 11389 1726854861.30142: in VariableManager get_vars() 11389 1726854861.30175: Calling all_inventory to load vars for managed_node3 11389 1726854861.30178: Calling groups_inventory to load vars for managed_node3 11389 1726854861.30180: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854861.30195: Calling all_plugins_play to load vars for managed_node3 11389 1726854861.30198: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854861.30201: Calling groups_plugins_play to load vars for managed_node3 11389 1726854861.31053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854861.31906: done with get_vars() 11389 1726854861.31920: done getting variables 11389 1726854861.31960: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:54:21 -0400 (0:00:00.029) 0:00:13.742 ****** 11389 1726854861.31984: entering _queue_task() for managed_node3/copy 11389 1726854861.32195: worker is 1 (out of 1 available) 11389 1726854861.32207: exiting _queue_task() for managed_node3/copy 11389 1726854861.32218: done queuing things up, now waiting for results queue to drain 11389 1726854861.32219: waiting for pending results... 
11389 1726854861.32386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11389 1726854861.32471: in run() - task 0affcc66-ac2b-deb8-c119-000000000035 11389 1726854861.32481: variable 'ansible_search_path' from source: unknown 11389 1726854861.32484: variable 'ansible_search_path' from source: unknown 11389 1726854861.32514: calling self._execute() 11389 1726854861.32582: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.32586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.32599: variable 'omit' from source: magic vars 11389 1726854861.32851: variable 'ansible_distribution_major_version' from source: facts 11389 1726854861.32859: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854861.32940: variable 'network_provider' from source: set_fact 11389 1726854861.32944: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854861.32947: when evaluation is False, skipping this task 11389 1726854861.32950: _execute() done 11389 1726854861.32952: dumping result to json 11389 1726854861.32956: done dumping result, returning 11389 1726854861.32964: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-deb8-c119-000000000035] 11389 1726854861.32971: sending task result for task 0affcc66-ac2b-deb8-c119-000000000035 11389 1726854861.33049: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000035 11389 1726854861.33053: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11389 1726854861.33099: no more pending results, returning what we have 11389 1726854861.33102: results queue empty 11389 1726854861.33103: checking for 
any_errors_fatal 11389 1726854861.33108: done checking for any_errors_fatal 11389 1726854861.33109: checking for max_fail_percentage 11389 1726854861.33111: done checking for max_fail_percentage 11389 1726854861.33112: checking to see if all hosts have failed and the running result is not ok 11389 1726854861.33113: done checking to see if all hosts have failed 11389 1726854861.33113: getting the remaining hosts for this loop 11389 1726854861.33115: done getting the remaining hosts for this loop 11389 1726854861.33118: getting the next task for host managed_node3 11389 1726854861.33124: done getting next task for host managed_node3 11389 1726854861.33127: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11389 1726854861.33130: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854861.33143: getting variables 11389 1726854861.33145: in VariableManager get_vars() 11389 1726854861.33176: Calling all_inventory to load vars for managed_node3 11389 1726854861.33179: Calling groups_inventory to load vars for managed_node3 11389 1726854861.33181: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854861.33191: Calling all_plugins_play to load vars for managed_node3 11389 1726854861.33193: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854861.33196: Calling groups_plugins_play to load vars for managed_node3 11389 1726854861.33923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854861.34769: done with get_vars() 11389 1726854861.34783: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:54:21 -0400 (0:00:00.028) 0:00:13.771 ****** 11389 1726854861.34843: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11389 1726854861.34844: Creating lock for fedora.linux_system_roles.network_connections 11389 1726854861.35041: worker is 1 (out of 1 available) 11389 1726854861.35053: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11389 1726854861.35065: done queuing things up, now waiting for results queue to drain 11389 1726854861.35066: waiting for pending results... 
11389 1726854861.35228: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11389 1726854861.35311: in run() - task 0affcc66-ac2b-deb8-c119-000000000036 11389 1726854861.35321: variable 'ansible_search_path' from source: unknown 11389 1726854861.35324: variable 'ansible_search_path' from source: unknown 11389 1726854861.35350: calling self._execute() 11389 1726854861.35419: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.35424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.35433: variable 'omit' from source: magic vars 11389 1726854861.35691: variable 'ansible_distribution_major_version' from source: facts 11389 1726854861.35700: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854861.35706: variable 'omit' from source: magic vars 11389 1726854861.35744: variable 'omit' from source: magic vars 11389 1726854861.35854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854861.37493: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854861.37536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854861.37562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854861.37592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854861.37613: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854861.37667: variable 'network_provider' from source: set_fact 11389 1726854861.37760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854861.37782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854861.37805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854861.37830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854861.37841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854861.37894: variable 'omit' from source: magic vars 11389 1726854861.38008: variable 'omit' from source: magic vars 11389 1726854861.38058: variable 'network_connections' from source: task vars 11389 1726854861.38067: variable 'controller_profile' from source: play vars 11389 1726854861.38113: variable 'controller_profile' from source: play vars 11389 1726854861.38125: variable 'controller_device' from source: play vars 11389 1726854861.38164: variable 'controller_device' from source: play vars 11389 1726854861.38174: variable 'port1_profile' from source: play vars 11389 1726854861.38216: variable 'port1_profile' from source: play vars 11389 1726854861.38222: variable 'dhcp_interface1' from source: play vars 11389 1726854861.38267: variable 'dhcp_interface1' from source: play vars 11389 1726854861.38276: variable 'controller_profile' from source: play vars 11389 1726854861.38318: variable 'controller_profile' from source: play vars 11389 1726854861.38324: 
variable 'port2_profile' from source: play vars 11389 1726854861.38382: variable 'port2_profile' from source: play vars 11389 1726854861.38390: variable 'dhcp_interface2' from source: play vars 11389 1726854861.38431: variable 'dhcp_interface2' from source: play vars 11389 1726854861.38436: variable 'controller_profile' from source: play vars 11389 1726854861.38482: variable 'controller_profile' from source: play vars 11389 1726854861.38605: variable 'omit' from source: magic vars 11389 1726854861.38612: variable '__lsr_ansible_managed' from source: task vars 11389 1726854861.38653: variable '__lsr_ansible_managed' from source: task vars 11389 1726854861.38778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11389 1726854861.38913: Loaded config def from plugin (lookup/template) 11389 1726854861.38916: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11389 1726854861.38937: File lookup term: get_ansible_managed.j2 11389 1726854861.38940: variable 'ansible_search_path' from source: unknown 11389 1726854861.38943: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11389 1726854861.38954: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11389 1726854861.38967: variable 'ansible_search_path' from source: unknown 11389 1726854861.42793: variable 'ansible_managed' from source: unknown 11389 1726854861.42797: variable 'omit' from source: magic vars 11389 1726854861.42818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854861.42848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854861.42872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854861.42897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854861.42912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854861.42946: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854861.42955: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.42963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.43067: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854861.43085: Set connection var ansible_timeout to 10 11389 1726854861.43095: Set connection var ansible_connection to ssh 11389 1726854861.43103: Set connection var ansible_shell_type to sh 11389 1726854861.43111: Set connection var ansible_pipelining to False 11389 1726854861.43127: Set connection var ansible_shell_executable to /bin/sh 11389 1726854861.43149: 
variable 'ansible_shell_executable' from source: unknown 11389 1726854861.43154: variable 'ansible_connection' from source: unknown 11389 1726854861.43160: variable 'ansible_module_compression' from source: unknown 11389 1726854861.43164: variable 'ansible_shell_type' from source: unknown 11389 1726854861.43172: variable 'ansible_shell_executable' from source: unknown 11389 1726854861.43177: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854861.43183: variable 'ansible_pipelining' from source: unknown 11389 1726854861.43194: variable 'ansible_timeout' from source: unknown 11389 1726854861.43201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854861.43329: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854861.43344: variable 'omit' from source: magic vars 11389 1726854861.43485: starting attempt loop 11389 1726854861.43492: running the handler 11389 1726854861.43494: _low_level_execute_command(): starting 11389 1726854861.43496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854861.44154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854861.44174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854861.44190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854861.44239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854861.44284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854861.44337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.46053: stdout chunk (state=3): >>>/root <<< 11389 1726854861.46191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854861.46204: stdout chunk (state=3): >>><<< 11389 1726854861.46217: stderr chunk (state=3): >>><<< 11389 1726854861.46244: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854861.46264: _low_level_execute_command(): starting 11389 1726854861.46274: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781 `" && echo ansible-tmp-1726854861.4625092-12107-4675982310781="` echo /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781 `" ) && sleep 0' 11389 1726854861.46897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854861.47009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854861.47025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854861.47059: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11389 1726854861.47145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.49166: stdout chunk (state=3): >>>ansible-tmp-1726854861.4625092-12107-4675982310781=/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781 <<< 11389 1726854861.49309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854861.49319: stdout chunk (state=3): >>><<< 11389 1726854861.49329: stderr chunk (state=3): >>><<< 11389 1726854861.49352: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854861.4625092-12107-4675982310781=/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854861.49408: variable 'ansible_module_compression' from source: unknown 
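The remote working directory above is created with the shell pattern `( umask 77 && mkdir -p "/root/.ansible/tmp" && mkdir "…/ansible-tmp-<timestamp>-<pid>-<random>" )`, which guarantees a private (0700) per-task directory. A minimal Python sketch of the same pattern, purely illustrative — the helper name and the `demo-ansible-tmp` base are ours, not ansible-core's:

```python
import os
import random
import tempfile
import time

def make_remote_style_tmpdir(base=None):
    """Mimic the log's `umask 77 && mkdir -p BASE && mkdir BASE/ansible-tmp-...`
    pattern: a uniquely named, owner-only (0700) scratch directory.
    Illustrative sketch only; ansible-core builds this via its shell plugin."""
    if base is None:
        # hypothetical base for the demo; the log uses /root/.ansible/tmp
        base = os.path.join(tempfile.gettempdir(), "demo-ansible-tmp")
    old_umask = os.umask(0o077)            # equivalent of `umask 77`
    try:
        os.makedirs(base, exist_ok=True)   # `mkdir -p`
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2**48))
        path = os.path.join(base, name)
        os.mkdir(path)                     # plain `mkdir`: fails if it exists
        return path
    finally:
        os.umask(old_umask)                # restore the caller's umask
```

The restrictive umask matters because module payloads and their arguments land in this directory before execution.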
11389 1726854861.49467: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11389 1726854861.49493: ANSIBALLZ: Acquiring lock 11389 1726854861.49496: ANSIBALLZ: Lock acquired: 140464423606448 11389 1726854861.49498: ANSIBALLZ: Creating module 11389 1726854861.77078: ANSIBALLZ: Writing module into payload 11389 1726854861.77374: ANSIBALLZ: Writing module 11389 1726854861.77407: ANSIBALLZ: Renaming module 11389 1726854861.77512: ANSIBALLZ: Done creating module 11389 1726854861.77515: variable 'ansible_facts' from source: unknown 11389 1726854861.77579: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py 11389 1726854861.77748: Sending initial data 11389 1726854861.77757: Sent initial data (166 bytes) 11389 1726854861.78660: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854861.78708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854861.78730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854861.78810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854861.78838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854861.78946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.80592: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854861.80666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854861.80730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpgg7phgwp /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py <<< 11389 1726854861.80733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py" <<< 11389 1726854861.80797: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpgg7phgwp" to remote "/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py" <<< 11389 1726854861.81998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854861.82001: stdout chunk (state=3): >>><<< 11389 1726854861.82003: stderr chunk (state=3): >>><<< 11389 1726854861.82005: done transferring module to remote 11389 1726854861.82008: _low_level_execute_command(): starting 11389 1726854861.82010: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/ /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py && sleep 0' 11389 1726854861.82500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854861.82510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854861.82521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854861.82535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854861.82547: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854861.82555: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854861.82565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854861.82580: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854861.82589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854861.82671: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854861.82714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854861.82780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854861.84867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854861.84881: stdout chunk (state=3): >>><<< 11389 1726854861.84896: stderr chunk (state=3): >>><<< 11389 1726854861.84916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854861.84928: _low_level_execute_command(): starting 11389 1726854861.84937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/AnsiballZ_network_connections.py && sleep 0' 11389 1726854861.85586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854861.85684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854861.85699: stderr chunk (state=3): >>>debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854861.85712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854861.85812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.26394: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 
110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11389 1726854862.28658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854862.28688: stderr chunk (state=3): >>><<< 11389 1726854862.28692: stdout chunk (state=3): >>><<< 11389 1726854862.28713: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
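The module's stdout above is one JSON document that the controller parses into the task result; the per-connection actions (`add connection`, `up connection … (is-modified)`, and so on) travel back in the `stderr` field. A hedged sketch of pulling those pieces out of such a payload — field names are taken from the log, but the helper itself is hypothetical, not Ansible's own parsing code:

```python
import json

def summarize_module_result(stdout_text):
    """Extract the fields visible in the log from a network_connections-style
    module result (a single JSON object on stdout). Illustrative only;
    the real parsing happens inside ansible-core's action/module plumbing."""
    result = json.loads(stdout_text)
    connections = result["invocation"]["module_args"]["connections"]
    return {
        "changed": result["changed"],
        "profiles": [c["name"] for c in connections],
        # each stderr line like "[010] #0, ... (is-modified)" records one action
        "actions": [ln for ln in result.get("stderr", "").splitlines() if ln],
    }

# toy payload shaped like the one in the log (UUIDs elided)
sample = json.dumps({
    "changed": True,
    "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0\n",
    "invocation": {"module_args": {"connections": [
        {"name": "bond0", "type": "bond"},
        {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    ]}},
})
summary = summarize_module_result(sample)
```

Note that `_invocation` and `invocation` in the real payload carry the same module_args; only the latter is used here.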
11389 1726854862.28757: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854862.28764: _low_level_execute_command(): starting 11389 1726854862.28772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854861.4625092-12107-4675982310781/ > /dev/null 2>&1 && sleep 0' 11389 1726854862.29248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.29252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.29259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854862.29262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854862.29264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.29313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.29316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.29319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.29383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.31283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.31306: stderr chunk (state=3): >>><<< 11389 1726854862.31309: stdout chunk (state=3): >>><<< 11389 1726854862.31322: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854862.31328: handler run complete 11389 1726854862.31357: attempt loop complete, returning result 11389 1726854862.31360: _execute() done 11389 1726854862.31362: dumping result to json 11389 1726854862.31368: done dumping result, returning 11389 1726854862.31379: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-deb8-c119-000000000036] 11389 1726854862.31382: sending task result for task 0affcc66-ac2b-deb8-c119-000000000036 11389 1726854862.31495: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000036 11389 1726854862.31497: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add 
connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active) 11389 1726854862.31631: no more pending results, returning what we have 11389 1726854862.31634: results queue empty 11389 1726854862.31635: checking for any_errors_fatal 11389 1726854862.31640: done checking for any_errors_fatal 11389 1726854862.31641: checking for max_fail_percentage 11389 1726854862.31642: done checking for max_fail_percentage 11389 1726854862.31643: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.31644: done checking to see if all hosts have failed 11389 1726854862.31644: getting the remaining hosts for this loop 11389 1726854862.31646: done getting the remaining hosts for this loop 11389 1726854862.31649: getting the next task for host managed_node3 11389 1726854862.31654: done getting next task for host managed_node3 11389 1726854862.31657: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11389 1726854862.31660: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854862.31669: getting variables 11389 1726854862.31670: in VariableManager get_vars() 11389 1726854862.31708: Calling all_inventory to load vars for managed_node3 11389 1726854862.31710: Calling groups_inventory to load vars for managed_node3 11389 1726854862.31712: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.31728: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.31731: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.31734: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.32639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.33955: done with get_vars() 11389 1726854862.33977: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:54:22 -0400 (0:00:00.992) 0:00:14.763 ****** 11389 1726854862.34060: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11389 1726854862.34062: Creating lock for fedora.linux_system_roles.network_state 11389 1726854862.34344: worker is 1 (out of 1 available) 11389 1726854862.34356: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11389 1726854862.34365: done queuing things up, now waiting for results queue to drain 11389 1726854862.34366: waiting for pending results... 
11389 1726854862.34713: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11389 1726854862.34797: in run() - task 0affcc66-ac2b-deb8-c119-000000000037 11389 1726854862.34800: variable 'ansible_search_path' from source: unknown 11389 1726854862.34803: variable 'ansible_search_path' from source: unknown 11389 1726854862.34834: calling self._execute() 11389 1726854862.34927: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.34938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.34992: variable 'omit' from source: magic vars 11389 1726854862.35324: variable 'ansible_distribution_major_version' from source: facts 11389 1726854862.35340: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854862.35470: variable 'network_state' from source: role '' defaults 11389 1726854862.35485: Evaluated conditional (network_state != {}): False 11389 1726854862.35495: when evaluation is False, skipping this task 11389 1726854862.35501: _execute() done 11389 1726854862.35507: dumping result to json 11389 1726854862.35568: done dumping result, returning 11389 1726854862.35571: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-deb8-c119-000000000037] 11389 1726854862.35574: sending task result for task 0affcc66-ac2b-deb8-c119-000000000037 11389 1726854862.35640: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000037 11389 1726854862.35643: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854862.35695: no more pending results, returning what we have 11389 1726854862.35700: results queue empty 11389 1726854862.35701: checking for any_errors_fatal 11389 1726854862.35718: done checking for any_errors_fatal 
11389 1726854862.35718: checking for max_fail_percentage 11389 1726854862.35720: done checking for max_fail_percentage 11389 1726854862.35721: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.35722: done checking to see if all hosts have failed 11389 1726854862.35723: getting the remaining hosts for this loop 11389 1726854862.35724: done getting the remaining hosts for this loop 11389 1726854862.35728: getting the next task for host managed_node3 11389 1726854862.35735: done getting next task for host managed_node3 11389 1726854862.35738: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11389 1726854862.35742: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854862.35758: getting variables 11389 1726854862.35759: in VariableManager get_vars() 11389 1726854862.35800: Calling all_inventory to load vars for managed_node3 11389 1726854862.35803: Calling groups_inventory to load vars for managed_node3 11389 1726854862.35805: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.35817: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.35819: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.35822: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.37410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.38802: done with get_vars() 11389 1726854862.38825: done getting variables 11389 1726854862.38883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:54:22 -0400 (0:00:00.048) 0:00:14.812 ****** 11389 1726854862.38919: entering _queue_task() for managed_node3/debug 11389 1726854862.39232: worker is 1 (out of 1 available) 11389 1726854862.39243: exiting _queue_task() for managed_node3/debug 11389 1726854862.39255: done queuing things up, now waiting for results queue to drain 11389 1726854862.39256: waiting for pending results... 
11389 1726854862.39614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11389 1726854862.39711: in run() - task 0affcc66-ac2b-deb8-c119-000000000038 11389 1726854862.39715: variable 'ansible_search_path' from source: unknown 11389 1726854862.39718: variable 'ansible_search_path' from source: unknown 11389 1726854862.39746: calling self._execute() 11389 1726854862.39844: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.40094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.40098: variable 'omit' from source: magic vars 11389 1726854862.40244: variable 'ansible_distribution_major_version' from source: facts 11389 1726854862.40263: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854862.40274: variable 'omit' from source: magic vars 11389 1726854862.40339: variable 'omit' from source: magic vars 11389 1726854862.40382: variable 'omit' from source: magic vars 11389 1726854862.40437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854862.40481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854862.40510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854862.40538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.40562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.40601: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854862.40611: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.40619: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11389 1726854862.40732: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854862.40749: Set connection var ansible_timeout to 10 11389 1726854862.40760: Set connection var ansible_connection to ssh 11389 1726854862.40775: Set connection var ansible_shell_type to sh 11389 1726854862.40790: Set connection var ansible_pipelining to False 11389 1726854862.40802: Set connection var ansible_shell_executable to /bin/sh 11389 1726854862.40831: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.40840: variable 'ansible_connection' from source: unknown 11389 1726854862.40849: variable 'ansible_module_compression' from source: unknown 11389 1726854862.40856: variable 'ansible_shell_type' from source: unknown 11389 1726854862.40863: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.40872: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.40884: variable 'ansible_pipelining' from source: unknown 11389 1726854862.40892: variable 'ansible_timeout' from source: unknown 11389 1726854862.40900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.41033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854862.41094: variable 'omit' from source: magic vars 11389 1726854862.41097: starting attempt loop 11389 1726854862.41100: running the handler 11389 1726854862.41190: variable '__network_connections_result' from source: set_fact 11389 1726854862.41257: handler run complete 11389 1726854862.41279: attempt loop complete, returning result 11389 1726854862.41289: _execute() done 11389 1726854862.41297: dumping result to json 11389 1726854862.41310: 
done dumping result, returning 11389 1726854862.41392: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-deb8-c119-000000000038] 11389 1726854862.41395: sending task result for task 0affcc66-ac2b-deb8-c119-000000000038 11389 1726854862.41693: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000038 11389 1726854862.41697: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1",
        "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d",
        "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active)"
    ]
}
11389 1726854862.41757: no more pending results, returning what we have 11389 1726854862.41760: results queue empty 11389 1726854862.41761: checking for any_errors_fatal 11389 1726854862.41765: done checking for any_errors_fatal 11389 1726854862.41766: checking for max_fail_percentage 11389 1726854862.41768: done checking for max_fail_percentage 11389 1726854862.41768: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.41769: done checking to see if all hosts have failed 11389 1726854862.41770: getting the remaining hosts for this loop 11389 1726854862.41771: done getting the remaining hosts for this loop 11389 1726854862.41775: getting the next task for host
managed_node3 11389 1726854862.41782: done getting next task for host managed_node3 11389 1726854862.41786: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11389 1726854862.41792: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854862.41804: getting variables 11389 1726854862.41805: in VariableManager get_vars() 11389 1726854862.41845: Calling all_inventory to load vars for managed_node3 11389 1726854862.41848: Calling groups_inventory to load vars for managed_node3 11389 1726854862.41851: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.41861: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.41864: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.41867: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.43276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.44790: done with get_vars() 11389 1726854862.44817: done getting variables 11389 1726854862.44876: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:54:22 -0400 (0:00:00.059) 0:00:14.872 ****** 11389 1726854862.44917: entering _queue_task() for managed_node3/debug 11389 1726854862.45228: worker is 1 (out of 1 available) 11389 1726854862.45241: exiting _queue_task() for managed_node3/debug 11389 1726854862.45252: done queuing things up, now waiting for results queue to drain 11389 1726854862.45253: waiting for pending results... 11389 1726854862.45525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11389 1726854862.45658: in run() - task 0affcc66-ac2b-deb8-c119-000000000039 11389 1726854862.45679: variable 'ansible_search_path' from source: unknown 11389 1726854862.45688: variable 'ansible_search_path' from source: unknown 11389 1726854862.45733: calling self._execute() 11389 1726854862.45825: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.45836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.45850: variable 'omit' from source: magic vars 11389 1726854862.46211: variable 'ansible_distribution_major_version' from source: facts 11389 1726854862.46228: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854862.46240: variable 'omit' from source: magic vars 11389 1726854862.46302: variable 'omit' from source: magic vars 11389 1726854862.46342: variable 'omit' from source: magic vars 11389 1726854862.46390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854862.46429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854862.46455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11389 1726854862.46482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.46502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.46537: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854862.46546: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.46576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.46662: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854862.46676: Set connection var ansible_timeout to 10 11389 1726854862.46691: Set connection var ansible_connection to ssh 11389 1726854862.46892: Set connection var ansible_shell_type to sh 11389 1726854862.46894: Set connection var ansible_pipelining to False 11389 1726854862.46896: Set connection var ansible_shell_executable to /bin/sh 11389 1726854862.46898: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.46899: variable 'ansible_connection' from source: unknown 11389 1726854862.46902: variable 'ansible_module_compression' from source: unknown 11389 1726854862.46903: variable 'ansible_shell_type' from source: unknown 11389 1726854862.46905: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.46906: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.46908: variable 'ansible_pipelining' from source: unknown 11389 1726854862.46909: variable 'ansible_timeout' from source: unknown 11389 1726854862.46911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.46913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854862.46916: variable 'omit' from source: magic vars 11389 1726854862.46917: starting attempt loop 11389 1726854862.46919: running the handler 11389 1726854862.46948: variable '__network_connections_result' from source: set_fact 11389 1726854862.47041: variable '__network_connections_result' from source: set_fact 11389 1726854862.47224: handler run complete 11389 1726854862.47265: attempt loop complete, returning result 11389 1726854862.47272: _execute() done 11389 1726854862.47280: dumping result to json 11389 1726854862.47292: done dumping result, returning 11389 1726854862.47305: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-deb8-c119-000000000039] 11389 1726854862.47316: sending task result for task 0affcc66-ac2b-deb8-c119-000000000039
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7e031d4d-65d1-4bb1-88e6-3f83b0d194c1 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 4a57b26e-7a43-4b52-89b8-337f92ac6d2d (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 95858a33-49bf-46bb-9b62-ad79a5b8ae35 (not-active)"
        ]
    }
}
11389 1726854862.47579: no more pending results, returning what we have 11389 1726854862.47583: results queue empty 11389 1726854862.47584: checking for any_errors_fatal 11389 1726854862.47595: done checking for any_errors_fatal 11389 1726854862.47596: checking for max_fail_percentage 11389 1726854862.47605: done checking for max_fail_percentage 11389 1726854862.47605: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.47606: done checking to see if all hosts have failed 11389 1726854862.47607: getting the remaining hosts for this loop 11389 1726854862.47609: done getting the remaining hosts for this loop 11389 1726854862.47613: getting the next task for host managed_node3 11389
1726854862.47620: done getting next task for host managed_node3 11389 1726854862.47624: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11389 1726854862.47628: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854862.47639: getting variables 11389 1726854862.47641: in VariableManager get_vars() 11389 1726854862.47681: Calling all_inventory to load vars for managed_node3 11389 1726854862.47685: Calling groups_inventory to load vars for managed_node3 11389 1726854862.47980: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.47991: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000039 11389 1726854862.47994: WORKER PROCESS EXITING 11389 1726854862.48004: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.48007: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.48010: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.49254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.50572: done with get_vars() 11389 1726854862.50591: done getting variables 11389 1726854862.50633: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:54:22 -0400 (0:00:00.057) 0:00:14.929 ****** 11389 1726854862.50657: entering _queue_task() for managed_node3/debug 11389 1726854862.50878: worker is 1 (out of 1 available) 11389 1726854862.50893: exiting _queue_task() for managed_node3/debug 11389 1726854862.50904: done queuing things up, now waiting for results queue to drain 11389 1726854862.50906: waiting for pending results... 11389 1726854862.51082: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11389 1726854862.51171: in run() - task 0affcc66-ac2b-deb8-c119-00000000003a 11389 1726854862.51185: variable 'ansible_search_path' from source: unknown 11389 1726854862.51190: variable 'ansible_search_path' from source: unknown 11389 1726854862.51217: calling self._execute() 11389 1726854862.51286: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.51292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.51302: variable 'omit' from source: magic vars 11389 1726854862.51571: variable 'ansible_distribution_major_version' from source: facts 11389 1726854862.51580: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854862.51659: variable 'network_state' from source: role '' defaults 11389 1726854862.51667: Evaluated conditional (network_state != {}): False 11389 1726854862.51672: when evaluation is False, skipping this task 11389 1726854862.51675: _execute() done 11389 1726854862.51678: dumping result to json 11389 1726854862.51686: done 
dumping result, returning 11389 1726854862.51691: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-deb8-c119-00000000003a] 11389 1726854862.51792: sending task result for task 0affcc66-ac2b-deb8-c119-00000000003a 11389 1726854862.51863: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000003a 11389 1726854862.51867: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
11389 1726854862.52006: no more pending results, returning what we have 11389 1726854862.52010: results queue empty 11389 1726854862.52011: checking for any_errors_fatal 11389 1726854862.52017: done checking for any_errors_fatal 11389 1726854862.52018: checking for max_fail_percentage 11389 1726854862.52020: done checking for max_fail_percentage 11389 1726854862.52020: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.52021: done checking to see if all hosts have failed 11389 1726854862.52022: getting the remaining hosts for this loop 11389 1726854862.52023: done getting the remaining hosts for this loop 11389 1726854862.52027: getting the next task for host managed_node3 11389 1726854862.52032: done getting next task for host managed_node3 11389 1726854862.52036: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11389 1726854862.52038: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 11389 1726854862.52051: getting variables 11389 1726854862.52053: in VariableManager get_vars() 11389 1726854862.52086: Calling all_inventory to load vars for managed_node3 11389 1726854862.52091: Calling groups_inventory to load vars for managed_node3 11389 1726854862.52093: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.52101: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.52104: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.52107: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.53185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.54049: done with get_vars() 11389 1726854862.54065: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:54:22 -0400 (0:00:00.034) 0:00:14.964 ****** 11389 1726854862.54134: entering _queue_task() for managed_node3/ping 11389 1726854862.54136: Creating lock for ping 11389 1726854862.54360: worker is 1 (out of 1 available) 11389 1726854862.54377: exiting _queue_task() for managed_node3/ping 11389 1726854862.54390: done queuing things up, now waiting for results queue to drain 11389 1726854862.54392: waiting for pending results... 
11389 1726854862.54554: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11389 1726854862.54639: in run() - task 0affcc66-ac2b-deb8-c119-00000000003b 11389 1726854862.54651: variable 'ansible_search_path' from source: unknown 11389 1726854862.54654: variable 'ansible_search_path' from source: unknown 11389 1726854862.54684: calling self._execute() 11389 1726854862.54753: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.54756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.54767: variable 'omit' from source: magic vars 11389 1726854862.55072: variable 'ansible_distribution_major_version' from source: facts 11389 1726854862.55194: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854862.55197: variable 'omit' from source: magic vars 11389 1726854862.55200: variable 'omit' from source: magic vars 11389 1726854862.55203: variable 'omit' from source: magic vars 11389 1726854862.55235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854862.55271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854862.55297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854862.55321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.55337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854862.55372: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854862.55381: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.55391: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11389 1726854862.55494: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854862.55509: Set connection var ansible_timeout to 10 11389 1726854862.55516: Set connection var ansible_connection to ssh 11389 1726854862.55526: Set connection var ansible_shell_type to sh 11389 1726854862.55536: Set connection var ansible_pipelining to False 11389 1726854862.55545: Set connection var ansible_shell_executable to /bin/sh 11389 1726854862.55570: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.55578: variable 'ansible_connection' from source: unknown 11389 1726854862.55586: variable 'ansible_module_compression' from source: unknown 11389 1726854862.55597: variable 'ansible_shell_type' from source: unknown 11389 1726854862.55613: variable 'ansible_shell_executable' from source: unknown 11389 1726854862.55616: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854862.55618: variable 'ansible_pipelining' from source: unknown 11389 1726854862.55620: variable 'ansible_timeout' from source: unknown 11389 1726854862.55622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854862.55812: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854862.55859: variable 'omit' from source: magic vars 11389 1726854862.55862: starting attempt loop 11389 1726854862.55865: running the handler 11389 1726854862.55870: _low_level_execute_command(): starting 11389 1726854862.55872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854862.56497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 
1726854862.56556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.56561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.56579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.56589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.56673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.58371: stdout chunk (state=3): >>>/root <<< 11389 1726854862.58576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.58579: stdout chunk (state=3): >>><<< 11389 1726854862.58581: stderr chunk (state=3): >>><<< 11389 1726854862.58614: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854862.58717: _low_level_execute_command(): starting 11389 1726854862.58722: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625 `" && echo ansible-tmp-1726854862.586212-12157-223151145552625="` echo /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625 `" ) && sleep 0' 11389 1726854862.59261: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.59334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.59382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.61299: stdout chunk (state=3): >>>ansible-tmp-1726854862.586212-12157-223151145552625=/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625 <<< 11389 1726854862.61404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.61432: stderr chunk (state=3): >>><<< 11389 1726854862.61435: stdout chunk (state=3): >>><<< 11389 1726854862.61453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854862.586212-12157-223151145552625=/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854862.61494: variable 'ansible_module_compression' from source: unknown 11389 1726854862.61532: ANSIBALLZ: Using lock for ping 11389 1726854862.61535: ANSIBALLZ: Acquiring lock 11389 1726854862.61538: ANSIBALLZ: Lock acquired: 140464420482096 11389 1726854862.61540: ANSIBALLZ: Creating module 11389 1726854862.69199: ANSIBALLZ: Writing module into payload 11389 1726854862.69240: ANSIBALLZ: Writing module 11389 1726854862.69259: ANSIBALLZ: Renaming module 11389 1726854862.69264: ANSIBALLZ: Done creating module 11389 1726854862.69280: variable 'ansible_facts' from source: unknown 11389 1726854862.69323: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py 11389 1726854862.69427: Sending initial data 11389 1726854862.69430: Sent initial data (152 bytes) 11389 1726854862.69898: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.69901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854862.69904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.69906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854862.69908: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854862.69911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.69959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.69962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.69964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.70033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.71740: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11389 1726854862.71747: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854862.71796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854862.71854: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmphww_bw9p /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py <<< 11389 1726854862.71860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py" <<< 11389 1726854862.71908: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmphww_bw9p" to remote "/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py" <<< 11389 1726854862.71913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py" <<< 11389 1726854862.72483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.72526: stderr chunk (state=3): >>><<< 11389 1726854862.72529: stdout chunk (state=3): >>><<< 11389 1726854862.72558: done transferring module to remote 11389 1726854862.72567: _low_level_execute_command(): starting 11389 1726854862.72572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/ /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py && sleep 0' 11389 1726854862.72992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854862.73023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854862.73026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854862.73028: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.73030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.73032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.73090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.73094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.73098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.73155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.74904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.74935: stderr chunk (state=3): >>><<< 11389 1726854862.74938: stdout chunk (state=3): >>><<< 11389 1726854862.74949: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854862.74952: _low_level_execute_command(): starting 11389 1726854862.74957: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/AnsiballZ_ping.py && sleep 0' 11389 1726854862.75368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.75405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854862.75408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.75410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.75412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854862.75460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.75466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.75470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.75607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.90911: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11389 1726854862.92356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854862.92360: stdout chunk (state=3): >>><<< 11389 1726854862.92362: stderr chunk (state=3): >>><<< 11389 1726854862.92385: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854862.92494: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854862.92498: _low_level_execute_command(): starting 11389 1726854862.92500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854862.586212-12157-223151145552625/ > /dev/null 2>&1 && sleep 0' 11389 1726854862.93050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854862.93072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854862.93091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854862.93114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854862.93134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854862.93149: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854862.93165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 
1726854862.93265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854862.93289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854862.93328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854862.93385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854862.95301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854862.95313: stdout chunk (state=3): >>><<< 11389 1726854862.95343: stderr chunk (state=3): >>><<< 11389 1726854862.95372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854862.95390: handler run complete 11389 1726854862.95411: attempt loop complete, returning result 11389 1726854862.95419: _execute() done 11389 1726854862.95426: dumping result to json 11389 1726854862.95434: done dumping result, returning 11389 1726854862.95447: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-deb8-c119-00000000003b] 11389 1726854862.95599: sending task result for task 0affcc66-ac2b-deb8-c119-00000000003b 11389 1726854862.95668: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000003b 11389 1726854862.95671: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11389 1726854862.95762: no more pending results, returning what we have 11389 1726854862.95766: results queue empty 11389 1726854862.95767: checking for any_errors_fatal 11389 1726854862.95773: done checking for any_errors_fatal 11389 1726854862.95774: checking for max_fail_percentage 11389 1726854862.95776: done checking for max_fail_percentage 11389 1726854862.95777: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.95778: done checking to see if all hosts have failed 11389 1726854862.95778: getting the remaining hosts for this loop 11389 1726854862.95780: done getting the remaining hosts for this loop 11389 1726854862.95784: getting the next task for host managed_node3 11389 1726854862.95924: done getting next task for host managed_node3 11389 1726854862.95926: ^ task is: TASK: meta (role_complete) 11389 1726854862.95929: ^ state is: 
HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854862.95938: getting variables 11389 1726854862.95940: in VariableManager get_vars() 11389 1726854862.95978: Calling all_inventory to load vars for managed_node3 11389 1726854862.95980: Calling groups_inventory to load vars for managed_node3 11389 1726854862.95982: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.96198: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.96202: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.96205: Calling groups_plugins_play to load vars for managed_node3 11389 1726854862.97668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854862.99280: done with get_vars() 11389 1726854862.99307: done getting variables 11389 1726854862.99399: done queuing things up, now waiting for results queue to drain 11389 1726854862.99401: results queue empty 11389 1726854862.99402: checking for any_errors_fatal 11389 1726854862.99405: done checking for any_errors_fatal 11389 1726854862.99406: checking for max_fail_percentage 11389 1726854862.99407: done checking for max_fail_percentage 11389 1726854862.99408: checking to see if all hosts have failed and the running result is not ok 11389 1726854862.99409: done checking to see if all hosts have failed 11389 1726854862.99409: getting the remaining hosts for this loop 11389 1726854862.99410: 
done getting the remaining hosts for this loop 11389 1726854862.99413: getting the next task for host managed_node3 11389 1726854862.99417: done getting next task for host managed_node3 11389 1726854862.99420: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11389 1726854862.99422: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854862.99424: getting variables 11389 1726854862.99425: in VariableManager get_vars() 11389 1726854862.99444: Calling all_inventory to load vars for managed_node3 11389 1726854862.99447: Calling groups_inventory to load vars for managed_node3 11389 1726854862.99449: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854862.99454: Calling all_plugins_play to load vars for managed_node3 11389 1726854862.99458: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854862.99461: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.00635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.02326: done with get_vars() 11389 1726854863.02349: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:54:23 -0400 (0:00:00.482) 0:00:15.447 ****** 11389 1726854863.02433: 
entering _queue_task() for managed_node3/include_tasks 11389 1726854863.02763: worker is 1 (out of 1 available) 11389 1726854863.02775: exiting _queue_task() for managed_node3/include_tasks 11389 1726854863.02786: done queuing things up, now waiting for results queue to drain 11389 1726854863.02791: waiting for pending results... 11389 1726854863.03078: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11389 1726854863.03201: in run() - task 0affcc66-ac2b-deb8-c119-00000000006e 11389 1726854863.03230: variable 'ansible_search_path' from source: unknown 11389 1726854863.03239: variable 'ansible_search_path' from source: unknown 11389 1726854863.03281: calling self._execute() 11389 1726854863.03394: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.03407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.03422: variable 'omit' from source: magic vars 11389 1726854863.03869: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.03873: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.03876: _execute() done 11389 1726854863.03879: dumping result to json 11389 1726854863.03882: done dumping result, returning 11389 1726854863.03884: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0affcc66-ac2b-deb8-c119-00000000006e] 11389 1726854863.03887: sending task result for task 0affcc66-ac2b-deb8-c119-00000000006e 11389 1726854863.04112: no more pending results, returning what we have 11389 1726854863.04118: in VariableManager get_vars() 11389 1726854863.04166: Calling all_inventory to load vars for managed_node3 11389 1726854863.04168: Calling groups_inventory to load vars for managed_node3 11389 1726854863.04171: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.04185: Calling all_plugins_play to load vars for managed_node3 
11389 1726854863.04189: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.04193: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.04801: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000006e 11389 1726854863.04804: WORKER PROCESS EXITING 11389 1726854863.05862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.07537: done with get_vars() 11389 1726854863.07819: variable 'ansible_search_path' from source: unknown 11389 1726854863.07821: variable 'ansible_search_path' from source: unknown 11389 1726854863.07863: we have included files to process 11389 1726854863.07865: generating all_blocks data 11389 1726854863.07867: done generating all_blocks data 11389 1726854863.07872: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854863.07873: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854863.07876: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11389 1726854863.08183: done processing included file 11389 1726854863.08186: iterating over new_blocks loaded from include file 11389 1726854863.08191: in VariableManager get_vars() 11389 1726854863.08218: done with get_vars() 11389 1726854863.08220: filtering new block on tags 11389 1726854863.08239: done filtering new block on tags 11389 1726854863.08241: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11389 1726854863.08246: extending task lists for all hosts with included blocks 11389 1726854863.08354: done extending task lists 
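The ping task traced above ran through a fixed sequence of `_low_level_execute_command()` calls: create a remote tmp directory under `~/.ansible/tmp`, transfer the AnsiballZ payload over SFTP, `chmod` it, run it with the remote Python, then remove the tmp directory. A minimal local sketch of that lifecycle (using `subprocess` in place of the SSH connection plugin, with made-up paths; this is an illustration of the pattern in the log, not Ansible's actual implementation):

```python
import json
import shlex
import subprocess
import sys
import tempfile
import time

def low_level_execute(cmd: str):
    """Run a command the way _low_level_execute_command() in the log does:
    through '/bin/sh -c', returning (rc, stdout, stderr)."""
    proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# The same five-step lifecycle the log shows for the ping module,
# run locally instead of over SSH:
base = tempfile.mkdtemp()
task_tmp = f"{base}/ansible-tmp-{time.time()}"
low_level_execute(f"umask 77 && mkdir -p {shlex.quote(task_tmp)}")   # 1. make tmp dir
module_path = f"{task_tmp}/AnsiballZ_ping.py"
with open(module_path, "w") as f:                                    # 2. stands in for the SFTP put
    f.write('import json; print(json.dumps({"ping": "pong"}))\n')
low_level_execute(                                                   # 3. chmod dir and module
    f"chmod u+x {shlex.quote(task_tmp)} {shlex.quote(module_path)}")
rc, out, _ = low_level_execute(                                      # 4. execute the module
    f"{shlex.quote(sys.executable)} {shlex.quote(module_path)}")
result = json.loads(out)
low_level_execute(                                                   # 5. clean up the tmp dir
    f"rm -f -r {shlex.quote(task_tmp)} > /dev/null 2>&1")
print(result)  # -> {'ping': 'pong'}
```

In the real run each of these steps is one SSH round-trip, which is why the same OpenSSH `debug1`/`debug2` configuration chatter repeats before every step; the multiplexed master at `/root/.ansible/cp/db1ec2560f` keeps those round-trips cheap.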
11389 1726854863.08356: done processing included files 11389 1726854863.08356: results queue empty 11389 1726854863.08357: checking for any_errors_fatal 11389 1726854863.08359: done checking for any_errors_fatal 11389 1726854863.08359: checking for max_fail_percentage 11389 1726854863.08361: done checking for max_fail_percentage 11389 1726854863.08362: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.08363: done checking to see if all hosts have failed 11389 1726854863.08363: getting the remaining hosts for this loop 11389 1726854863.08364: done getting the remaining hosts for this loop 11389 1726854863.08367: getting the next task for host managed_node3 11389 1726854863.08374: done getting next task for host managed_node3 11389 1726854863.08377: ^ task is: TASK: Get stat for interface {{ interface }} 11389 1726854863.08380: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854863.08383: getting variables 11389 1726854863.08384: in VariableManager get_vars() 11389 1726854863.08401: Calling all_inventory to load vars for managed_node3 11389 1726854863.08404: Calling groups_inventory to load vars for managed_node3 11389 1726854863.08406: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.08411: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.08413: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.08415: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.09656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.11740: done with get_vars() 11389 1726854863.11762: done getting variables 11389 1726854863.11935: variable 'interface' from source: task vars 11389 1726854863.11939: variable 'controller_device' from source: play vars 11389 1726854863.11999: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:54:23 -0400 (0:00:00.095) 0:00:15.543 ****** 11389 1726854863.12035: entering _queue_task() for managed_node3/stat 11389 1726854863.12374: worker is 1 (out of 1 available) 11389 1726854863.12384: exiting _queue_task() for managed_node3/stat 11389 1726854863.12397: done queuing things up, now waiting for results queue to drain 11389 1726854863.12399: waiting for pending results... 
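The task header `TASK [Get stat for interface nm-bond]` comes from the templated name `Get stat for interface {{ interface }}` resolved through the variable chain the log records: `interface` (task vars) points at `controller_device` (play vars), which holds `nm-bond`. Ansible does this with Jinja2; the following is a deliberately tiny regex-based stand-in, just to show the recursive resolution, not the real templating engine:

```python
import re

def resolve(template: str, variables: dict, max_depth: int = 10) -> str:
    """Repeatedly substitute {{ name }} placeholders until the string
    stops changing (or give up after max_depth passes)."""
    pattern = re.compile(r"\{\{\s*(\w+)\s*\}\}")
    for _ in range(max_depth):
        new = pattern.sub(
            lambda m: str(variables.get(m.group(1), m.group(0))), template)
        if new == template:
            return new
        template = new
    raise RecursionError("template did not converge")

variables = {
    "interface": "{{ controller_device }}",  # task vars, as in the log
    "controller_device": "nm-bond",          # play vars
}
print(resolve("Get stat for interface {{ interface }}", variables))
# -> Get stat for interface nm-bond
```

Two substitution passes are needed here, matching the two `variable 'controller_device' from source: play vars` lookups the log shows before the resolved task name is printed.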
11389 1726854863.12648: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 11389 1726854863.12777: in run() - task 0affcc66-ac2b-deb8-c119-000000000241 11389 1726854863.12799: variable 'ansible_search_path' from source: unknown 11389 1726854863.12808: variable 'ansible_search_path' from source: unknown 11389 1726854863.12845: calling self._execute() 11389 1726854863.12935: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.12947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.12961: variable 'omit' from source: magic vars 11389 1726854863.13301: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.13393: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.13397: variable 'omit' from source: magic vars 11389 1726854863.13400: variable 'omit' from source: magic vars 11389 1726854863.13493: variable 'interface' from source: task vars 11389 1726854863.13505: variable 'controller_device' from source: play vars 11389 1726854863.13573: variable 'controller_device' from source: play vars 11389 1726854863.13598: variable 'omit' from source: magic vars 11389 1726854863.13643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854863.13683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854863.13709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854863.13731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.13747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.13783: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11389 1726854863.13792: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.13801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.13992: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854863.13995: Set connection var ansible_timeout to 10 11389 1726854863.13997: Set connection var ansible_connection to ssh 11389 1726854863.13999: Set connection var ansible_shell_type to sh 11389 1726854863.14001: Set connection var ansible_pipelining to False 11389 1726854863.14004: Set connection var ansible_shell_executable to /bin/sh 11389 1726854863.14006: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.14008: variable 'ansible_connection' from source: unknown 11389 1726854863.14010: variable 'ansible_module_compression' from source: unknown 11389 1726854863.14012: variable 'ansible_shell_type' from source: unknown 11389 1726854863.14014: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.14016: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.14018: variable 'ansible_pipelining' from source: unknown 11389 1726854863.14020: variable 'ansible_timeout' from source: unknown 11389 1726854863.14022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.14246: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854863.14251: variable 'omit' from source: magic vars 11389 1726854863.14253: starting attempt loop 11389 1726854863.14256: running the handler 11389 1726854863.14259: _low_level_execute_command(): starting 11389 1726854863.14272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 
1726854863.15011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854863.15026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854863.15122: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854863.15144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.15161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.15268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.16952: stdout chunk (state=3): >>>/root <<< 11389 1726854863.17089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.17101: stdout chunk (state=3): >>><<< 11389 1726854863.17113: stderr chunk (state=3): >>><<< 11389 1726854863.17237: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854863.17241: _low_level_execute_command(): starting 11389 1726854863.17244: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554 `" && echo ansible-tmp-1726854863.1714056-12183-220794007809554="` echo /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554 `" ) && sleep 0' 11389 1726854863.17825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854863.17850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854863.17867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854863.17902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.17933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854863.18004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.18043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854863.18059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.18081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.18175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.20498: stdout chunk (state=3): >>>ansible-tmp-1726854863.1714056-12183-220794007809554=/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554 <<< 11389 1726854863.20502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.20505: stdout chunk (state=3): >>><<< 11389 1726854863.20507: stderr chunk (state=3): >>><<< 11389 1726854863.20510: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854863.1714056-12183-220794007809554=/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854863.20513: variable 'ansible_module_compression' from source: unknown 11389 1726854863.20606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854863.20645: variable 'ansible_facts' from source: unknown 11389 1726854863.20870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py 11389 1726854863.21114: Sending initial data 11389 1726854863.21117: Sent initial data (153 bytes) 11389 1726854863.21671: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854863.21703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854863.21805: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854863.21817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.21832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.21918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.23612: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854863.23636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854863.23704: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpghv6h0ui /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py <<< 11389 1726854863.23904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpghv6h0ui" to remote "/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py" <<< 11389 1726854863.25060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.25138: stderr chunk (state=3): >>><<< 11389 1726854863.25144: stdout chunk (state=3): >>><<< 11389 1726854863.25173: done transferring module to remote 11389 1726854863.25185: _low_level_execute_command(): starting 11389 1726854863.25191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/ /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py && sleep 0' 11389 1726854863.26602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.26712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.29017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.29021: stdout chunk (state=3): >>><<< 11389 1726854863.29024: stderr chunk (state=3): >>><<< 11389 1726854863.29026: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 11389 1726854863.29028: _low_level_execute_command(): starting 11389 1726854863.29031: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/AnsiballZ_stat.py && sleep 0' 11389 1726854863.30118: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854863.30122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854863.30201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854863.30213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.30382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854863.30396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.30445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.30543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.45539: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": 
"/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 1726854862.1162987, "mtime": 1726854862.1162987, "ctime": 1726854862.1162987, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854863.46750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854863.46780: stderr chunk (state=3): >>><<< 11389 1726854863.46784: stdout chunk (state=3): >>><<< 11389 1726854863.46801: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28290, "dev": 23, "nlink": 1, "atime": 1726854862.1162987, "mtime": 1726854862.1162987, "ctime": 1726854862.1162987, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854863.46837: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854863.46845: _low_level_execute_command(): starting 11389 1726854863.46849: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854863.1714056-12183-220794007809554/ > /dev/null 2>&1 && sleep 0' 11389 1726854863.47282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854863.47292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.47295: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854863.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.47343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.47346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.47446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.49305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.49327: stderr chunk (state=3): >>><<< 11389 1726854863.49330: stdout chunk (state=3): >>><<< 11389 1726854863.49344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854863.49350: handler run complete 11389 1726854863.49384: attempt loop complete, returning result 11389 1726854863.49389: _execute() done 11389 1726854863.49391: dumping result to json 11389 1726854863.49396: done dumping result, returning 11389 1726854863.49403: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0affcc66-ac2b-deb8-c119-000000000241] 11389 1726854863.49409: sending task result for task 0affcc66-ac2b-deb8-c119-000000000241 11389 1726854863.49507: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000241 11389 1726854863.49510: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726854862.1162987, "block_size": 4096, "blocks": 0, "ctime": 1726854862.1162987, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28290, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726854862.1162987, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11389 1726854863.49603: no more pending results, returning what we have 11389 1726854863.49607: results queue empty 11389 1726854863.49608: checking for 
any_errors_fatal 11389 1726854863.49609: done checking for any_errors_fatal 11389 1726854863.49610: checking for max_fail_percentage 11389 1726854863.49612: done checking for max_fail_percentage 11389 1726854863.49612: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.49613: done checking to see if all hosts have failed 11389 1726854863.49614: getting the remaining hosts for this loop 11389 1726854863.49615: done getting the remaining hosts for this loop 11389 1726854863.49619: getting the next task for host managed_node3 11389 1726854863.49627: done getting next task for host managed_node3 11389 1726854863.49629: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11389 1726854863.49632: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854863.49635: getting variables 11389 1726854863.49637: in VariableManager get_vars() 11389 1726854863.49672: Calling all_inventory to load vars for managed_node3 11389 1726854863.49674: Calling groups_inventory to load vars for managed_node3 11389 1726854863.49676: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.49685: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.49695: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.49699: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.50566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.52035: done with get_vars() 11389 1726854863.52056: done getting variables 11389 1726854863.52104: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854863.52193: variable 'interface' from source: task vars 11389 1726854863.52196: variable 'controller_device' from source: play vars 11389 1726854863.52238: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:54:23 -0400 (0:00:00.402) 0:00:15.945 ****** 11389 1726854863.52263: entering _queue_task() for managed_node3/assert 11389 1726854863.52509: worker is 1 (out of 1 available) 11389 1726854863.52522: exiting _queue_task() for managed_node3/assert 11389 1726854863.52534: done queuing things up, now waiting for results queue to drain 11389 1726854863.52536: waiting for pending results... 
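Editor's note: the assert task queued next checks `interface_stat.stat.exists` against the stat result registered above, where `/sys/class/net/nm-bond` reported `exists: true`, `islnk: true`, and `lnk_target: ../../devices/virtual/net/nm-bond`. A rough Python model of that subset of `ansible.builtin.stat` fields follows; it uses throwaway temporary paths, not the real sysfs entries, and is a sketch rather than the module's implementation.

```python
# Sketch of the stat fields the log's assert task relies on.
# Real behavior: ansible.builtin.stat with follow=false, i.e. lstat semantics.
import os
import tempfile

def interface_stat(path):
    """Return a small subset of stat-module-style fields for a path."""
    if not os.path.lexists(path):          # lexists: do not follow symlinks
        return {"exists": False}
    is_link = os.path.islink(path)
    return {
        "exists": True,
        "islnk": is_link,
        "lnk_target": os.readlink(path) if is_link else None,
        "lnk_source": os.path.realpath(path),
    }

# Demonstrate on a stand-in for /sys/class/net/<iface> -> ../../devices/...
with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, "devices", "virtual", "net", "nm-bond")
    os.makedirs(target)
    link = os.path.join(tmp, "class-net-nm-bond")
    os.symlink(target, link)
    st = interface_stat(link)
    # Mirrors the task's conditional: (interface_stat.stat.exists) -> True
    assert st["exists"] and st["islnk"]
    print(st["exists"], st["islnk"])  # prints: True True
```

The assert action then reduces to exactly the conditional evaluation logged below ("Evaluated conditional (interface_stat.stat.exists): True"), which is why the task returns `changed: false` with "All assertions passed".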
11389 1726854863.52709: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 11389 1726854863.52788: in run() - task 0affcc66-ac2b-deb8-c119-00000000006f 11389 1726854863.52798: variable 'ansible_search_path' from source: unknown 11389 1726854863.52801: variable 'ansible_search_path' from source: unknown 11389 1726854863.52829: calling self._execute() 11389 1726854863.52906: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.52911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.52919: variable 'omit' from source: magic vars 11389 1726854863.53165: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.53175: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.53181: variable 'omit' from source: magic vars 11389 1726854863.53221: variable 'omit' from source: magic vars 11389 1726854863.53288: variable 'interface' from source: task vars 11389 1726854863.53292: variable 'controller_device' from source: play vars 11389 1726854863.53392: variable 'controller_device' from source: play vars 11389 1726854863.53396: variable 'omit' from source: magic vars 11389 1726854863.53407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854863.53464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854863.53470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854863.53483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.53556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.53559: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 11389 1726854863.53562: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.53564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.53634: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854863.53648: Set connection var ansible_timeout to 10 11389 1726854863.53651: Set connection var ansible_connection to ssh 11389 1726854863.53653: Set connection var ansible_shell_type to sh 11389 1726854863.53659: Set connection var ansible_pipelining to False 11389 1726854863.53694: Set connection var ansible_shell_executable to /bin/sh 11389 1726854863.53701: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.53708: variable 'ansible_connection' from source: unknown 11389 1726854863.53713: variable 'ansible_module_compression' from source: unknown 11389 1726854863.53717: variable 'ansible_shell_type' from source: unknown 11389 1726854863.53723: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.53728: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.53893: variable 'ansible_pipelining' from source: unknown 11389 1726854863.53897: variable 'ansible_timeout' from source: unknown 11389 1726854863.53899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.53901: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854863.53904: variable 'omit' from source: magic vars 11389 1726854863.53905: starting attempt loop 11389 1726854863.53908: running the handler 11389 1726854863.54036: variable 'interface_stat' from source: set_fact 11389 1726854863.54059: Evaluated conditional 
(interface_stat.stat.exists): True 11389 1726854863.54093: handler run complete 11389 1726854863.54114: attempt loop complete, returning result 11389 1726854863.54123: _execute() done 11389 1726854863.54129: dumping result to json 11389 1726854863.54136: done dumping result, returning 11389 1726854863.54151: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0affcc66-ac2b-deb8-c119-00000000006f] 11389 1726854863.54161: sending task result for task 0affcc66-ac2b-deb8-c119-00000000006f 11389 1726854863.54259: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000006f 11389 1726854863.54262: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
11389 1726854863.54344: no more pending results, returning what we have 11389 1726854863.54348: results queue empty 11389 1726854863.54349: checking for any_errors_fatal 11389 1726854863.54357: done checking for any_errors_fatal 11389 1726854863.54358: checking for max_fail_percentage 11389 1726854863.54360: done checking for max_fail_percentage 11389 1726854863.54361: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.54362: done checking to see if all hosts have failed 11389 1726854863.54362: getting the remaining hosts for this loop 11389 1726854863.54364: done getting the remaining hosts for this loop 11389 1726854863.54366: getting the next task for host managed_node3 11389 1726854863.54373: done getting next task for host managed_node3 11389 1726854863.54376: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11389 1726854863.54378: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854863.54381: getting variables 11389 1726854863.54382: in VariableManager get_vars() 11389 1726854863.54424: Calling all_inventory to load vars for managed_node3 11389 1726854863.54426: Calling groups_inventory to load vars for managed_node3 11389 1726854863.54428: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.54436: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.54439: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.54441: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.55789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.56652: done with get_vars() 11389 1726854863.56671: done getting variables
TASK [Include the task 'assert_profile_present.yml'] ***************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67
Friday 20 September 2024 13:54:23 -0400 (0:00:00.044) 0:00:15.990 ******
11389 1726854863.56740: entering _queue_task() for managed_node3/include_tasks 11389 1726854863.56972: worker is 1 (out of 1 available) 11389 1726854863.56984: exiting _queue_task() for managed_node3/include_tasks 11389 1726854863.56997: done queuing things up, now waiting for results queue to drain 11389 1726854863.56999: waiting for pending results... 
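For reference, the include that this banner announces plausibly looks like the following in tests_bond.yml. This is a sketch reconstructed from the trace (which resolves controller_profile, port1_profile, and port2_profile and then includes the file once per item: bond0, bond0.0, bond0.1); the actual YAML is not shown in this log, and the `profile: "{{ item }}"` var is an assumption based on 'profile' later being resolved "from source: include params".

```yaml
# Reconstruction (not verbatim from the log) of the include at
# tests_bond.yml:67, looped over the three bond profiles.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"  # resolves to bond0 in this run
    - "{{ port1_profile }}"       # resolves to bond0.0
    - "{{ port2_profile }}"       # resolves to bond0.1
  vars:
    profile: "{{ item }}"         # assumed; the trace only shows 'item' being resolved
```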
11389 1726854863.57172: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 11389 1726854863.57242: in run() - task 0affcc66-ac2b-deb8-c119-000000000070 11389 1726854863.57250: variable 'ansible_search_path' from source: unknown 11389 1726854863.57292: variable 'controller_profile' from source: play vars 11389 1726854863.57515: variable 'controller_profile' from source: play vars 11389 1726854863.57519: variable 'port1_profile' from source: play vars 11389 1726854863.57546: variable 'port1_profile' from source: play vars 11389 1726854863.57557: variable 'port2_profile' from source: play vars 11389 1726854863.57686: variable 'port2_profile' from source: play vars 11389 1726854863.57693: variable 'omit' from source: magic vars 11389 1726854863.57892: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.57896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.57899: variable 'omit' from source: magic vars 11389 1726854863.58045: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.58060: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.58096: variable 'item' from source: unknown 11389 1726854863.58167: variable 'item' from source: unknown 11389 1726854863.58413: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.58416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.58419: variable 'omit' from source: magic vars 11389 1726854863.58586: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.58594: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.58614: variable 'item' from source: unknown 11389 1726854863.58659: variable 'item' from source: unknown 11389 1726854863.58730: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 
1726854863.58733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.58742: variable 'omit' from source: magic vars 11389 1726854863.58849: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.58852: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.58872: variable 'item' from source: unknown 11389 1726854863.58918: variable 'item' from source: unknown 11389 1726854863.58981: dumping result to json 11389 1726854863.58984: done dumping result, returning 11389 1726854863.58990: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0affcc66-ac2b-deb8-c119-000000000070] 11389 1726854863.58992: sending task result for task 0affcc66-ac2b-deb8-c119-000000000070 11389 1726854863.59025: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000070 11389 1726854863.59028: WORKER PROCESS EXITING 11389 1726854863.59059: no more pending results, returning what we have 11389 1726854863.59064: in VariableManager get_vars() 11389 1726854863.59109: Calling all_inventory to load vars for managed_node3 11389 1726854863.59111: Calling groups_inventory to load vars for managed_node3 11389 1726854863.59113: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.59127: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.59130: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.59132: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.59929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.60777: done with get_vars() 11389 1726854863.60792: variable 'ansible_search_path' from source: unknown 11389 1726854863.60804: variable 'ansible_search_path' from source: unknown 11389 1726854863.60810: variable 'ansible_search_path' from source: unknown 11389 
1726854863.60814: we have included files to process 11389 1726854863.60814: generating all_blocks data 11389 1726854863.60815: done generating all_blocks data 11389 1726854863.60818: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.60819: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.60820: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.60961: in VariableManager get_vars() 11389 1726854863.60977: done with get_vars() 11389 1726854863.61146: done processing included file 11389 1726854863.61148: iterating over new_blocks loaded from include file 11389 1726854863.61148: in VariableManager get_vars() 11389 1726854863.61159: done with get_vars() 11389 1726854863.61160: filtering new block on tags 11389 1726854863.61174: done filtering new block on tags 11389 1726854863.61175: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 11389 1726854863.61178: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61179: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61181: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61242: in VariableManager get_vars() 11389 1726854863.61255: done with get_vars() 11389 1726854863.61405: done 
processing included file 11389 1726854863.61406: iterating over new_blocks loaded from include file 11389 1726854863.61407: in VariableManager get_vars() 11389 1726854863.61417: done with get_vars() 11389 1726854863.61418: filtering new block on tags 11389 1726854863.61431: done filtering new block on tags 11389 1726854863.61433: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 11389 1726854863.61435: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61436: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61438: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11389 1726854863.61498: in VariableManager get_vars() 11389 1726854863.61546: done with get_vars() 11389 1726854863.61693: done processing included file 11389 1726854863.61695: iterating over new_blocks loaded from include file 11389 1726854863.61695: in VariableManager get_vars() 11389 1726854863.61706: done with get_vars() 11389 1726854863.61708: filtering new block on tags 11389 1726854863.61718: done filtering new block on tags 11389 1726854863.61720: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 11389 1726854863.61722: extending task lists for all hosts with included blocks 11389 1726854863.63908: done extending task lists 11389 1726854863.63915: done processing included files 11389 1726854863.63916: results queue empty 11389 
1726854863.63917: checking for any_errors_fatal 11389 1726854863.63920: done checking for any_errors_fatal 11389 1726854863.63921: checking for max_fail_percentage 11389 1726854863.63922: done checking for max_fail_percentage 11389 1726854863.63922: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.63923: done checking to see if all hosts have failed 11389 1726854863.63924: getting the remaining hosts for this loop 11389 1726854863.63924: done getting the remaining hosts for this loop 11389 1726854863.63926: getting the next task for host managed_node3 11389 1726854863.63929: done getting next task for host managed_node3 11389 1726854863.63931: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11389 1726854863.63932: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854863.63934: getting variables 11389 1726854863.63935: in VariableManager get_vars() 11389 1726854863.63947: Calling all_inventory to load vars for managed_node3 11389 1726854863.63949: Calling groups_inventory to load vars for managed_node3 11389 1726854863.63951: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.63957: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.63958: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.63960: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.68196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.69673: done with get_vars() 11389 1726854863.69704: done getting variables
TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024 13:54:23 -0400 (0:00:00.130) 0:00:16.120 ******
11389 1726854863.69781: entering _queue_task() for managed_node3/include_tasks 11389 1726854863.70135: worker is 1 (out of 1 available) 11389 1726854863.70149: exiting _queue_task() for managed_node3/include_tasks 11389 1726854863.70160: done queuing things up, now waiting for results queue to drain 11389 1726854863.70162: waiting for pending results... 
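The nested include announced here sits at line 3 of assert_profile_present.yml and pulls in get_profile_stat.yml from the same tasks directory. A minimal sketch, assuming the file is referenced by relative path (the exact YAML is not visible in this excerpt):

```yaml
# Reconstruction (not verbatim) of assert_profile_present.yml:3.
# 'profile' is already in scope here: the trace later resolves it
# "from source: include params", i.e. it was passed by the outer include.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
```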
11389 1726854863.70607: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11389 1726854863.70611: in run() - task 0affcc66-ac2b-deb8-c119-00000000025f 11389 1726854863.70613: variable 'ansible_search_path' from source: unknown 11389 1726854863.70615: variable 'ansible_search_path' from source: unknown 11389 1726854863.70618: calling self._execute() 11389 1726854863.70699: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.70793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.70796: variable 'omit' from source: magic vars 11389 1726854863.71098: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.71113: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.71124: _execute() done 11389 1726854863.71131: dumping result to json 11389 1726854863.71145: done dumping result, returning 11389 1726854863.71155: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-deb8-c119-00000000025f] 11389 1726854863.71165: sending task result for task 0affcc66-ac2b-deb8-c119-00000000025f 11389 1726854863.71522: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000025f 11389 1726854863.71526: WORKER PROCESS EXITING 11389 1726854863.71553: no more pending results, returning what we have 11389 1726854863.71557: in VariableManager get_vars() 11389 1726854863.71601: Calling all_inventory to load vars for managed_node3 11389 1726854863.71604: Calling groups_inventory to load vars for managed_node3 11389 1726854863.71606: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.71617: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.71620: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.71623: Calling groups_plugins_play to load vars for managed_node3 11389 
1726854863.72939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.74601: done with get_vars() 11389 1726854863.74621: variable 'ansible_search_path' from source: unknown 11389 1726854863.74623: variable 'ansible_search_path' from source: unknown 11389 1726854863.74661: we have included files to process 11389 1726854863.74662: generating all_blocks data 11389 1726854863.74664: done generating all_blocks data 11389 1726854863.74666: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854863.74667: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854863.74669: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854863.76111: done processing included file 11389 1726854863.76114: iterating over new_blocks loaded from include file 11389 1726854863.76116: in VariableManager get_vars() 11389 1726854863.76140: done with get_vars() 11389 1726854863.76142: filtering new block on tags 11389 1726854863.76167: done filtering new block on tags 11389 1726854863.76170: in VariableManager get_vars() 11389 1726854863.76295: done with get_vars() 11389 1726854863.76297: filtering new block on tags 11389 1726854863.76321: done filtering new block on tags 11389 1726854863.76324: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11389 1726854863.76329: extending task lists for all hosts with included blocks 11389 1726854863.76846: done extending task lists 11389 1726854863.76848: done processing included files 11389 1726854863.76849: results queue empty 11389 
1726854863.76849: checking for any_errors_fatal 11389 1726854863.76853: done checking for any_errors_fatal 11389 1726854863.76854: checking for max_fail_percentage 11389 1726854863.76855: done checking for max_fail_percentage 11389 1726854863.76856: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.76857: done checking to see if all hosts have failed 11389 1726854863.76857: getting the remaining hosts for this loop 11389 1726854863.76858: done getting the remaining hosts for this loop 11389 1726854863.76861: getting the next task for host managed_node3 11389 1726854863.76866: done getting next task for host managed_node3 11389 1726854863.76868: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11389 1726854863.76870: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854863.76873: getting variables 11389 1726854863.76874: in VariableManager get_vars() 11389 1726854863.76888: Calling all_inventory to load vars for managed_node3 11389 1726854863.76891: Calling groups_inventory to load vars for managed_node3 11389 1726854863.76893: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.76899: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.76901: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.76904: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.78029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.79091: done with get_vars() 11389 1726854863.79110: done getting variables 11389 1726854863.79143: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024 13:54:23 -0400 (0:00:00.093) 0:00:16.214 ******
11389 1726854863.79168: entering _queue_task() for managed_node3/set_fact 11389 1726854863.79736: worker is 1 (out of 1 available) 11389 1726854863.79748: exiting _queue_task() for managed_node3/set_fact 11389 1726854863.79758: done queuing things up, now waiting for results queue to drain 11389 1726854863.79760: waiting for pending results... 
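The set_fact task queued here initializes three flags that later assertions check. A sketch of get_profile_stat.yml:3, where the fact names and initial values are copied from the "ansible_facts" block this task reports in the log, though the YAML itself is reconstructed:

```yaml
# Sketch of get_profile_stat.yml:3; fact names and values taken from
# the task result in this log, YAML layout assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```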
11389 1726854863.80086: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11389 1726854863.80260: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b0 11389 1726854863.80289: variable 'ansible_search_path' from source: unknown 11389 1726854863.80308: variable 'ansible_search_path' from source: unknown 11389 1726854863.80349: calling self._execute() 11389 1726854863.80447: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.80459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.80476: variable 'omit' from source: magic vars 11389 1726854863.80961: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.81198: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.81202: variable 'omit' from source: magic vars 11389 1726854863.81205: variable 'omit' from source: magic vars 11389 1726854863.81207: variable 'omit' from source: magic vars 11389 1726854863.81253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854863.81692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854863.81696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854863.81698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.81701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.81703: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854863.81709: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.81712: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11389 1726854863.81836: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854863.81848: Set connection var ansible_timeout to 10 11389 1726854863.81855: Set connection var ansible_connection to ssh 11389 1726854863.81864: Set connection var ansible_shell_type to sh 11389 1726854863.81876: Set connection var ansible_pipelining to False 11389 1726854863.81885: Set connection var ansible_shell_executable to /bin/sh 11389 1726854863.81912: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.81955: variable 'ansible_connection' from source: unknown 11389 1726854863.82192: variable 'ansible_module_compression' from source: unknown 11389 1726854863.82196: variable 'ansible_shell_type' from source: unknown 11389 1726854863.82198: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.82200: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.82202: variable 'ansible_pipelining' from source: unknown 11389 1726854863.82204: variable 'ansible_timeout' from source: unknown 11389 1726854863.82206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.82289: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854863.82342: variable 'omit' from source: magic vars 11389 1726854863.82351: starting attempt loop 11389 1726854863.82360: running the handler 11389 1726854863.82380: handler run complete 11389 1726854863.82396: attempt loop complete, returning result 11389 1726854863.82403: _execute() done 11389 1726854863.82409: dumping result to json 11389 1726854863.82415: done dumping result, returning 11389 1726854863.82425: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-deb8-c119-0000000003b0] 11389 1726854863.82433: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b0
ok: [managed_node3] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
11389 1726854863.82579: no more pending results, returning what we have 11389 1726854863.82583: results queue empty 11389 1726854863.82583: checking for any_errors_fatal 11389 1726854863.82585: done checking for any_errors_fatal 11389 1726854863.82585: checking for max_fail_percentage 11389 1726854863.82589: done checking for max_fail_percentage 11389 1726854863.82589: checking to see if all hosts have failed and the running result is not ok 11389 1726854863.82590: done checking to see if all hosts have failed 11389 1726854863.82591: getting the remaining hosts for this loop 11389 1726854863.82593: done getting the remaining hosts for this loop 11389 1726854863.82596: getting the next task for host managed_node3 11389 1726854863.82602: done getting next task for host managed_node3 11389 1726854863.82604: ^ task is: TASK: Stat profile file 11389 1726854863.82608: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854863.82612: getting variables 11389 1726854863.82614: in VariableManager get_vars() 11389 1726854863.82653: Calling all_inventory to load vars for managed_node3 11389 1726854863.82657: Calling groups_inventory to load vars for managed_node3 11389 1726854863.82659: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854863.82672: Calling all_plugins_play to load vars for managed_node3 11389 1726854863.82675: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854863.82678: Calling groups_plugins_play to load vars for managed_node3 11389 1726854863.83302: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b0 11389 1726854863.83306: WORKER PROCESS EXITING 11389 1726854863.84159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854863.87278: done with get_vars() 11389 1726854863.87310: done getting variables
TASK [Stat profile file] *******************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Friday 20 September 2024 13:54:23 -0400 (0:00:00.082) 0:00:16.297 ******
11389 1726854863.87408: entering _queue_task() for managed_node3/stat 11389 1726854863.87751: worker is 1 (out of 1 available) 11389 1726854863.87764: exiting _queue_task() for managed_node3/stat 11389 1726854863.87779: done queuing things up, now waiting for results queue to drain 11389 1726854863.87781: waiting for pending results... 
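The stat task queued here (get_profile_stat.yml:9) runs the 'stat' module on the managed node over SSH, which is why the trace that follows shows _low_level_execute_command() and the OpenSSH multiplexed connection. A sketch under stated assumptions: the trace resolves 'profile' and 'item' from include params, but the actual path expression and register name are not visible in this excerpt, so both are placeholders:

```yaml
# Sketch of get_profile_stat.yml:9; path and register name are
# placeholder assumptions, not taken from the log.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # assumed path
  register: profile_stat  # assumed register name
```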
11389 1726854863.88073: running TaskExecutor() for managed_node3/TASK: Stat profile file 11389 1726854863.88117: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b1 11389 1726854863.88136: variable 'ansible_search_path' from source: unknown 11389 1726854863.88143: variable 'ansible_search_path' from source: unknown 11389 1726854863.88189: calling self._execute() 11389 1726854863.88292: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.88304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.88319: variable 'omit' from source: magic vars 11389 1726854863.88711: variable 'ansible_distribution_major_version' from source: facts 11389 1726854863.88758: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854863.88830: variable 'omit' from source: magic vars 11389 1726854863.88835: variable 'omit' from source: magic vars 11389 1726854863.89042: variable 'profile' from source: include params 11389 1726854863.89053: variable 'item' from source: include params 11389 1726854863.89163: variable 'item' from source: include params 11389 1726854863.89185: variable 'omit' from source: magic vars 11389 1726854863.89395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854863.89399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854863.89586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854863.89592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.89594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854863.89597: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 
1726854863.89600: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.89602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.89825: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854863.90021: Set connection var ansible_timeout to 10 11389 1726854863.90024: Set connection var ansible_connection to ssh 11389 1726854863.90027: Set connection var ansible_shell_type to sh 11389 1726854863.90029: Set connection var ansible_pipelining to False 11389 1726854863.90031: Set connection var ansible_shell_executable to /bin/sh 11389 1726854863.90033: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.90037: variable 'ansible_connection' from source: unknown 11389 1726854863.90039: variable 'ansible_module_compression' from source: unknown 11389 1726854863.90041: variable 'ansible_shell_type' from source: unknown 11389 1726854863.90044: variable 'ansible_shell_executable' from source: unknown 11389 1726854863.90045: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854863.90048: variable 'ansible_pipelining' from source: unknown 11389 1726854863.90050: variable 'ansible_timeout' from source: unknown 11389 1726854863.90052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854863.90463: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854863.90467: variable 'omit' from source: magic vars 11389 1726854863.90472: starting attempt loop 11389 1726854863.90474: running the handler 11389 1726854863.90484: _low_level_execute_command(): starting 11389 1726854863.90501: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854863.92128: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.92276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.92367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.92457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.94114: stdout chunk (state=3): >>>/root <<< 11389 1726854863.94356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.94533: stderr chunk (state=3): >>><<< 11389 1726854863.94536: stdout chunk (state=3): >>><<< 11389 1726854863.94540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854863.94557: _low_level_execute_command(): starting 11389 1726854863.94576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052 `" && echo ansible-tmp-1726854863.945434-12226-109712464681052="` echo /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052 `" ) && sleep 0' 11389 1726854863.95639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854863.95642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854863.95645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854863.95648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854863.95650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854863.95678: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.95683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.95818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854863.97790: stdout chunk (state=3): >>>ansible-tmp-1726854863.945434-12226-109712464681052=/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052 <<< 11389 1726854863.97995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854863.97998: stdout chunk (state=3): >>><<< 11389 1726854863.98001: stderr chunk (state=3): >>><<< 11389 1726854863.98003: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854863.945434-12226-109712464681052=/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854863.98016: variable 'ansible_module_compression' from source: unknown 11389 1726854863.98074: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854863.98122: variable 'ansible_facts' from source: unknown 11389 1726854863.98226: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py 11389 1726854863.98477: Sending initial data 11389 1726854863.98480: Sent initial data (152 bytes) 11389 1726854863.98896: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854863.98902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854863.98932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.98935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854863.98938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854863.98992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854863.98995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854863.99061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.00700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854864.00735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854864.00794: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpgydzp_7i /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py <<< 11389 1726854864.00803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py" <<< 11389 1726854864.00855: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpgydzp_7i" to remote "/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py" <<< 11389 1726854864.01747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.01750: stdout chunk (state=3): >>><<< 11389 1726854864.01753: stderr chunk (state=3): >>><<< 11389 1726854864.01755: done transferring module to remote 11389 1726854864.01800: _low_level_execute_command(): starting 11389 1726854864.01812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/ /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py && sleep 0' 11389 1726854864.02334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.02337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854864.02340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.02342: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.02345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.02399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.02402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.02405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.02473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.04332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.04336: stdout chunk (state=3): >>><<< 11389 1726854864.04339: stderr chunk (state=3): >>><<< 11389 1726854864.04370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.04374: _low_level_execute_command(): starting 11389 1726854864.04385: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/AnsiballZ_stat.py && sleep 0' 11389 1726854864.04852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.04855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.04858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.04860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854864.04862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.04913: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.04916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.04991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.20336: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854864.21811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854864.21815: stdout chunk (state=3): >>><<< 11389 1726854864.21818: stderr chunk (state=3): >>><<< 11389 1726854864.21820: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854864.21823: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854864.21826: _low_level_execute_command(): starting 11389 1726854864.21828: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854863.945434-12226-109712464681052/ > /dev/null 2>&1 && sleep 0' 11389 1726854864.22429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.22437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.22456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.22501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.22570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.22596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.22614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.22729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.24704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.24708: stdout chunk (state=3): >>><<< 11389 1726854864.24710: stderr chunk (state=3): >>><<< 11389 1726854864.24810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.24813: handler run complete 11389 1726854864.24835: attempt loop complete, returning result 11389 1726854864.24838: _execute() done 11389 1726854864.24840: dumping result to json 11389 1726854864.24843: done dumping result, returning 11389 1726854864.24852: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-deb8-c119-0000000003b1] 11389 1726854864.24856: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b1 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11389 1726854864.25021: no more pending results, returning what we have 11389 1726854864.25025: results queue empty 11389 1726854864.25026: checking for any_errors_fatal 11389 1726854864.25031: done checking for any_errors_fatal 11389 1726854864.25032: checking for max_fail_percentage 11389 1726854864.25033: done checking for max_fail_percentage 11389 1726854864.25034: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.25035: done checking to see if all hosts have failed 11389 1726854864.25036: getting the remaining hosts for this loop 11389 1726854864.25037: done getting the remaining hosts for this loop 11389 1726854864.25040: getting the next task for host managed_node3 11389 1726854864.25047: done getting next task for host managed_node3 11389 1726854864.25049: ^ task is: TASK: Set NM profile exist flag based on the profile files 11389 1726854864.25053: ^ state is: 
HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854864.25058: getting variables 11389 1726854864.25059: in VariableManager get_vars() 11389 1726854864.25102: Calling all_inventory to load vars for managed_node3 11389 1726854864.25105: Calling groups_inventory to load vars for managed_node3 11389 1726854864.25107: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.25119: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.25122: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.25125: Calling groups_plugins_play to load vars for managed_node3 11389 1726854864.25721: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b1 11389 1726854864.25724: WORKER PROCESS EXITING 11389 1726854864.27329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854864.30812: done with get_vars() 11389 1726854864.30844: done getting variables 11389 1726854864.31116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:54:24 -0400 (0:00:00.437) 0:00:16.734 ****** 11389 1726854864.31150: entering _queue_task() for managed_node3/set_fact 11389 1726854864.31724: worker is 1 (out of 1 available) 11389 1726854864.31737: exiting _queue_task() for managed_node3/set_fact 11389 1726854864.31750: done queuing things up, now waiting for results queue to drain 11389 1726854864.31751: waiting for pending results... 11389 1726854864.32258: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11389 1726854864.32613: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b2 11389 1726854864.32634: variable 'ansible_search_path' from source: unknown 11389 1726854864.32643: variable 'ansible_search_path' from source: unknown 11389 1726854864.32690: calling self._execute() 11389 1726854864.32995: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.32998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.33002: variable 'omit' from source: magic vars 11389 1726854864.33685: variable 'ansible_distribution_major_version' from source: facts 11389 1726854864.33772: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854864.34017: variable 'profile_stat' from source: set_fact 11389 1726854864.34038: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854864.34092: when evaluation is False, skipping this task 11389 1726854864.34101: _execute() done 11389 1726854864.34109: dumping result to json 11389 1726854864.34117: done dumping 
result, returning 11389 1726854864.34127: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-deb8-c119-0000000003b2] 11389 1726854864.34137: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b2 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854864.34346: no more pending results, returning what we have 11389 1726854864.34351: results queue empty 11389 1726854864.34352: checking for any_errors_fatal 11389 1726854864.34359: done checking for any_errors_fatal 11389 1726854864.34360: checking for max_fail_percentage 11389 1726854864.34362: done checking for max_fail_percentage 11389 1726854864.34363: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.34364: done checking to see if all hosts have failed 11389 1726854864.34365: getting the remaining hosts for this loop 11389 1726854864.34366: done getting the remaining hosts for this loop 11389 1726854864.34372: getting the next task for host managed_node3 11389 1726854864.34379: done getting next task for host managed_node3 11389 1726854864.34382: ^ task is: TASK: Get NM profile info 11389 1726854864.34389: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854864.34395: getting variables 11389 1726854864.34397: in VariableManager get_vars() 11389 1726854864.34442: Calling all_inventory to load vars for managed_node3 11389 1726854864.34445: Calling groups_inventory to load vars for managed_node3 11389 1726854864.34448: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.34462: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.34466: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.34472: Calling groups_plugins_play to load vars for managed_node3 11389 1726854864.35260: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b2 11389 1726854864.35264: WORKER PROCESS EXITING 11389 1726854864.36675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854864.39658: done with get_vars() 11389 1726854864.39897: done getting variables 11389 1726854864.39960: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:54:24 -0400 (0:00:00.088) 0:00:16.823 ****** 11389 1726854864.40000: entering _queue_task() for managed_node3/shell 11389 1726854864.40734: worker is 1 (out of 1 available) 11389 1726854864.40746: exiting _queue_task() for managed_node3/shell 11389 1726854864.40758: done queuing things up, now 
waiting for results queue to drain 11389 1726854864.40760: waiting for pending results... 11389 1726854864.41262: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11389 1726854864.41436: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b3 11389 1726854864.41450: variable 'ansible_search_path' from source: unknown 11389 1726854864.41454: variable 'ansible_search_path' from source: unknown 11389 1726854864.41491: calling self._execute() 11389 1726854864.41727: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.41731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.41822: variable 'omit' from source: magic vars 11389 1726854864.42591: variable 'ansible_distribution_major_version' from source: facts 11389 1726854864.42602: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854864.42610: variable 'omit' from source: magic vars 11389 1726854864.42774: variable 'omit' from source: magic vars 11389 1726854864.42935: variable 'profile' from source: include params 11389 1726854864.42940: variable 'item' from source: include params 11389 1726854864.43119: variable 'item' from source: include params 11389 1726854864.43138: variable 'omit' from source: magic vars 11389 1726854864.43384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854864.43993: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854864.43997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854864.43999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854864.44001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 
1726854864.44004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854864.44006: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.44008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.44218: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854864.44233: Set connection var ansible_timeout to 10 11389 1726854864.44319: Set connection var ansible_connection to ssh 11389 1726854864.44331: Set connection var ansible_shell_type to sh 11389 1726854864.44341: Set connection var ansible_pipelining to False 11389 1726854864.44402: Set connection var ansible_shell_executable to /bin/sh 11389 1726854864.44432: variable 'ansible_shell_executable' from source: unknown 11389 1726854864.44441: variable 'ansible_connection' from source: unknown 11389 1726854864.44448: variable 'ansible_module_compression' from source: unknown 11389 1726854864.44693: variable 'ansible_shell_type' from source: unknown 11389 1726854864.44696: variable 'ansible_shell_executable' from source: unknown 11389 1726854864.44699: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.44701: variable 'ansible_pipelining' from source: unknown 11389 1726854864.44704: variable 'ansible_timeout' from source: unknown 11389 1726854864.44706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.44770: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854864.45092: variable 'omit' from source: magic vars 11389 1726854864.45096: starting attempt loop 11389 1726854864.45099: running the handler 11389 1726854864.45101: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854864.45104: _low_level_execute_command(): starting 11389 1726854864.45106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854864.46105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.46133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.46147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.46165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854864.46235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.46291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.46318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.46340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.46558: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11389 1726854864.48208: stdout chunk (state=3): >>>/root <<< 11389 1726854864.48305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.48360: stderr chunk (state=3): >>><<< 11389 1726854864.48409: stdout chunk (state=3): >>><<< 11389 1726854864.48438: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.48454: _low_level_execute_command(): starting 11389 1726854864.48520: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948 `" && echo ansible-tmp-1726854864.4844418-12259-198531160273948="` echo /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948 
`" ) && sleep 0' 11389 1726854864.49594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.49714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.49731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.49904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.49950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.49969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.49995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.50117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.52402: stdout chunk (state=3): >>>ansible-tmp-1726854864.4844418-12259-198531160273948=/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948 <<< 11389 1726854864.52406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.52408: stdout chunk (state=3): >>><<< 11389 1726854864.52410: stderr chunk (state=3): >>><<< 11389 1726854864.52430: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854864.4844418-12259-198531160273948=/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.52492: variable 'ansible_module_compression' from source: unknown 11389 1726854864.52743: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854864.52746: variable 'ansible_facts' from source: unknown 11389 1726854864.52883: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py 11389 1726854864.53204: Sending initial data 11389 1726854864.53208: Sent initial data (156 bytes) 11389 1726854864.54280: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
11389 1726854864.54503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.54517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.54534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.54635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.56202: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11389 1726854864.56231: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854864.56295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854864.56354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5cym8p5a /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py <<< 11389 1726854864.56358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py" <<< 11389 1726854864.56423: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5cym8p5a" to remote "/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py" <<< 11389 1726854864.57299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.57469: stderr chunk (state=3): >>><<< 11389 1726854864.57473: stdout chunk (state=3): >>><<< 11389 1726854864.57475: done transferring module to remote 11389 1726854864.57477: _low_level_execute_command(): starting 11389 1726854864.57479: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/ /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py && sleep 0' 11389 1726854864.58377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.58403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.58419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11389 1726854864.58517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854864.58542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854864.58564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.58580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.58674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.60508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.60536: stdout chunk (state=3): >>><<< 11389 1726854864.60552: stderr chunk (state=3): >>><<< 11389 1726854864.60577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.60589: _low_level_execute_command(): starting 11389 1726854864.60603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/AnsiballZ_command.py && sleep 0' 11389 1726854864.61542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.61584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.61604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.61634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854864.61752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.61928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.61981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.79231: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:54:24.770935", "end": "2024-09-20 13:54:24.791314", "delta": "0:00:00.020379", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854864.80804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.80819: stderr chunk (state=3): >>>Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854864.80879: stderr chunk (state=3): >>><<< 11389 1726854864.81133: stdout chunk (state=3): >>><<< 11389 1726854864.81137: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:54:24.770935", "end": "2024-09-20 13:54:24.791314", "delta": "0:00:00.020379", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854864.81140: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854864.81142: _low_level_execute_command(): starting 11389 1726854864.81145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854864.4844418-12259-198531160273948/ > /dev/null 2>&1 && sleep 0' 11389 1726854864.81941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854864.81955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854864.81967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854864.82011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854864.82029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854864.82120: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854864.82141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854864.82239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854864.84367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854864.84371: stdout chunk (state=3): >>><<< 11389 1726854864.84375: stderr chunk (state=3): >>><<< 11389 1726854864.84398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854864.84411: handler run complete 11389 1726854864.84457: Evaluated conditional (False): False 11389 1726854864.84474: attempt loop complete, returning result 11389 1726854864.84699: _execute() done 11389 1726854864.84702: dumping result to json 11389 1726854864.84705: done dumping result, returning 11389 1726854864.84707: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-deb8-c119-0000000003b3] 11389 1726854864.84709: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b3 11389 1726854864.84779: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b3 11389 1726854864.84784: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.020379", "end": "2024-09-20 13:54:24.791314", "rc": 0, "start": "2024-09-20 13:54:24.770935" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11389 1726854864.84858: no more pending results, returning what we have 11389 1726854864.84861: results queue empty 11389 1726854864.84862: checking for any_errors_fatal 11389 1726854864.84869: done checking for any_errors_fatal 11389 1726854864.84870: checking for max_fail_percentage 11389 1726854864.84872: done checking for max_fail_percentage 11389 1726854864.84872: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.84873: done checking to see if all hosts have failed 11389 1726854864.84874: 
getting the remaining hosts for this loop 11389 1726854864.84875: done getting the remaining hosts for this loop 11389 1726854864.84878: getting the next task for host managed_node3 11389 1726854864.84885: done getting next task for host managed_node3 11389 1726854864.84889: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11389 1726854864.84893: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854864.84898: getting variables 11389 1726854864.84899: in VariableManager get_vars() 11389 1726854864.84940: Calling all_inventory to load vars for managed_node3 11389 1726854864.84943: Calling groups_inventory to load vars for managed_node3 11389 1726854864.84946: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.85084: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.85090: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.85094: Calling groups_plugins_play to load vars for managed_node3 11389 1726854864.87038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854864.88850: done with get_vars() 11389 1726854864.88875: done getting variables 11389 1726854864.88942: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:54:24 -0400 (0:00:00.489) 0:00:17.312 ****** 11389 1726854864.88976: entering _queue_task() for managed_node3/set_fact 11389 1726854864.89409: worker is 1 (out of 1 available) 11389 1726854864.89421: exiting _queue_task() for managed_node3/set_fact 11389 1726854864.89544: done queuing things up, now waiting for results queue to drain 11389 1726854864.89546: waiting for pending results... 
11389 1726854864.89813: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11389 1726854864.89818: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b4 11389 1726854864.89894: variable 'ansible_search_path' from source: unknown 11389 1726854864.89897: variable 'ansible_search_path' from source: unknown 11389 1726854864.89901: calling self._execute() 11389 1726854864.89999: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.90005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.90017: variable 'omit' from source: magic vars 11389 1726854864.90506: variable 'ansible_distribution_major_version' from source: facts 11389 1726854864.90517: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854864.90659: variable 'nm_profile_exists' from source: set_fact 11389 1726854864.90686: Evaluated conditional (nm_profile_exists.rc == 0): True 11389 1726854864.90693: variable 'omit' from source: magic vars 11389 1726854864.90740: variable 'omit' from source: magic vars 11389 1726854864.90782: variable 'omit' from source: magic vars 11389 1726854864.90847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854864.90903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854864.90923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854864.90941: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854864.90954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854864.91006: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
11389 1726854864.91010: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.91012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.91159: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854864.91166: Set connection var ansible_timeout to 10 11389 1726854864.91392: Set connection var ansible_connection to ssh 11389 1726854864.91395: Set connection var ansible_shell_type to sh 11389 1726854864.91398: Set connection var ansible_pipelining to False 11389 1726854864.91401: Set connection var ansible_shell_executable to /bin/sh 11389 1726854864.91403: variable 'ansible_shell_executable' from source: unknown 11389 1726854864.91405: variable 'ansible_connection' from source: unknown 11389 1726854864.91408: variable 'ansible_module_compression' from source: unknown 11389 1726854864.91410: variable 'ansible_shell_type' from source: unknown 11389 1726854864.91412: variable 'ansible_shell_executable' from source: unknown 11389 1726854864.91414: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.91416: variable 'ansible_pipelining' from source: unknown 11389 1726854864.91418: variable 'ansible_timeout' from source: unknown 11389 1726854864.91420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.91426: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854864.91438: variable 'omit' from source: magic vars 11389 1726854864.91443: starting attempt loop 11389 1726854864.91446: running the handler 11389 1726854864.91459: handler run complete 11389 1726854864.91473: attempt loop complete, returning result 11389 1726854864.91476: _execute() done 
11389 1726854864.91479: dumping result to json 11389 1726854864.91481: done dumping result, returning 11389 1726854864.91491: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-deb8-c119-0000000003b4] 11389 1726854864.91531: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b4 11389 1726854864.91613: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b4 11389 1726854864.91616: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11389 1726854864.91685: no more pending results, returning what we have 11389 1726854864.91691: results queue empty 11389 1726854864.91692: checking for any_errors_fatal 11389 1726854864.91703: done checking for any_errors_fatal 11389 1726854864.91704: checking for max_fail_percentage 11389 1726854864.91706: done checking for max_fail_percentage 11389 1726854864.91706: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.91708: done checking to see if all hosts have failed 11389 1726854864.91709: getting the remaining hosts for this loop 11389 1726854864.91711: done getting the remaining hosts for this loop 11389 1726854864.91715: getting the next task for host managed_node3 11389 1726854864.91724: done getting next task for host managed_node3 11389 1726854864.91840: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854864.91845: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854864.91850: getting variables 11389 1726854864.91851: in VariableManager get_vars() 11389 1726854864.91905: Calling all_inventory to load vars for managed_node3 11389 1726854864.91908: Calling groups_inventory to load vars for managed_node3 11389 1726854864.91911: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.91923: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.91927: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.91931: Calling groups_plugins_play to load vars for managed_node3 11389 1726854864.93095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854864.93964: done with get_vars() 11389 1726854864.93982: done getting variables 11389 1726854864.94024: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854864.94115: variable 'profile' from source: include params 11389 1726854864.94119: variable 'item' from source: include params 11389 1726854864.94159: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:54:24 -0400 (0:00:00.052) 0:00:17.364 ****** 11389 1726854864.94192: entering _queue_task() for managed_node3/command 11389 1726854864.94466: worker is 1 (out of 1 available) 11389 1726854864.94479: exiting _queue_task() for managed_node3/command 11389 1726854864.94496: done queuing things up, now waiting for results queue to drain 11389 1726854864.94498: waiting for pending results... 11389 1726854864.94725: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 11389 1726854864.95001: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b6 11389 1726854864.95004: variable 'ansible_search_path' from source: unknown 11389 1726854864.95007: variable 'ansible_search_path' from source: unknown 11389 1726854864.95011: calling self._execute() 11389 1726854864.95014: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.95017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.95020: variable 'omit' from source: magic vars 11389 1726854864.95326: variable 'ansible_distribution_major_version' from source: facts 11389 1726854864.95340: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854864.95457: variable 'profile_stat' from source: set_fact 11389 1726854864.95473: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854864.95476: when evaluation is False, skipping this task 11389 1726854864.95479: _execute() done 11389 1726854864.95482: dumping result to json 11389 1726854864.95484: done dumping result, returning 11389 1726854864.95495: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcc66-ac2b-deb8-c119-0000000003b6] 11389 1726854864.95501: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b6 skipping: 
[managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854864.95699: no more pending results, returning what we have 11389 1726854864.95702: results queue empty 11389 1726854864.95703: checking for any_errors_fatal 11389 1726854864.95708: done checking for any_errors_fatal 11389 1726854864.95708: checking for max_fail_percentage 11389 1726854864.95710: done checking for max_fail_percentage 11389 1726854864.95710: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.95711: done checking to see if all hosts have failed 11389 1726854864.95712: getting the remaining hosts for this loop 11389 1726854864.95713: done getting the remaining hosts for this loop 11389 1726854864.95716: getting the next task for host managed_node3 11389 1726854864.95722: done getting next task for host managed_node3 11389 1726854864.95724: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854864.95727: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854864.95730: getting variables 11389 1726854864.95731: in VariableManager get_vars() 11389 1726854864.95766: Calling all_inventory to load vars for managed_node3 11389 1726854864.95769: Calling groups_inventory to load vars for managed_node3 11389 1726854864.95771: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.95780: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.95782: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.95785: Calling groups_plugins_play to load vars for managed_node3 11389 1726854864.96329: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b6 11389 1726854864.96332: WORKER PROCESS EXITING 11389 1726854864.96772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854864.97630: done with get_vars() 11389 1726854864.97646: done getting variables 11389 1726854864.97693: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854864.97776: variable 'profile' from source: include params 11389 1726854864.97779: variable 'item' from source: include params 11389 1726854864.97822: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:54:24 -0400 (0:00:00.036) 0:00:17.401 ****** 11389 1726854864.97843: entering _queue_task() for managed_node3/set_fact 11389 1726854864.98148: worker is 1 (out of 1 available) 11389 1726854864.98161: exiting _queue_task() for managed_node3/set_fact 11389 
1726854864.98171: done queuing things up, now waiting for results queue to drain 11389 1726854864.98172: waiting for pending results... 11389 1726854864.98504: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 11389 1726854864.98554: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b7 11389 1726854864.98571: variable 'ansible_search_path' from source: unknown 11389 1726854864.98575: variable 'ansible_search_path' from source: unknown 11389 1726854864.98608: calling self._execute() 11389 1726854864.98744: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854864.98747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854864.98751: variable 'omit' from source: magic vars 11389 1726854864.99081: variable 'ansible_distribution_major_version' from source: facts 11389 1726854864.99094: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854864.99212: variable 'profile_stat' from source: set_fact 11389 1726854864.99284: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854864.99291: when evaluation is False, skipping this task 11389 1726854864.99294: _execute() done 11389 1726854864.99296: dumping result to json 11389 1726854864.99298: done dumping result, returning 11389 1726854864.99300: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcc66-ac2b-deb8-c119-0000000003b7] 11389 1726854864.99302: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b7 11389 1726854864.99361: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b7 11389 1726854864.99364: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854864.99431: no more pending results, returning what we have 11389 1726854864.99437: results queue empty 11389 
1726854864.99437: checking for any_errors_fatal 11389 1726854864.99444: done checking for any_errors_fatal 11389 1726854864.99445: checking for max_fail_percentage 11389 1726854864.99447: done checking for max_fail_percentage 11389 1726854864.99448: checking to see if all hosts have failed and the running result is not ok 11389 1726854864.99449: done checking to see if all hosts have failed 11389 1726854864.99449: getting the remaining hosts for this loop 11389 1726854864.99450: done getting the remaining hosts for this loop 11389 1726854864.99455: getting the next task for host managed_node3 11389 1726854864.99462: done getting next task for host managed_node3 11389 1726854864.99465: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11389 1726854864.99469: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854864.99475: getting variables 11389 1726854864.99476: in VariableManager get_vars() 11389 1726854864.99635: Calling all_inventory to load vars for managed_node3 11389 1726854864.99638: Calling groups_inventory to load vars for managed_node3 11389 1726854864.99640: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854864.99650: Calling all_plugins_play to load vars for managed_node3 11389 1726854864.99653: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854864.99655: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.00433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.01310: done with get_vars() 11389 1726854865.01335: done getting variables 11389 1726854865.01392: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854865.01498: variable 'profile' from source: include params 11389 1726854865.01502: variable 'item' from source: include params 11389 1726854865.01557: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:54:25 -0400 (0:00:00.037) 0:00:17.438 ****** 11389 1726854865.01585: entering _queue_task() for managed_node3/command 11389 1726854865.01869: worker is 1 (out of 1 available) 11389 1726854865.01882: exiting _queue_task() for managed_node3/command 11389 1726854865.02015: done queuing things up, now waiting for results queue to drain 11389 1726854865.02018: waiting for pending results... 
11389 1726854865.02167: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 11389 1726854865.02248: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b8 11389 1726854865.02254: variable 'ansible_search_path' from source: unknown 11389 1726854865.02258: variable 'ansible_search_path' from source: unknown 11389 1726854865.02370: calling self._execute() 11389 1726854865.02375: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.02378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.02400: variable 'omit' from source: magic vars 11389 1726854865.02794: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.02797: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.02854: variable 'profile_stat' from source: set_fact 11389 1726854865.02870: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854865.02874: when evaluation is False, skipping this task 11389 1726854865.02877: _execute() done 11389 1726854865.02879: dumping result to json 11389 1726854865.02881: done dumping result, returning 11389 1726854865.02884: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcc66-ac2b-deb8-c119-0000000003b8] 11389 1726854865.02915: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b8 11389 1726854865.02978: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b8 11389 1726854865.02980: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854865.03060: no more pending results, returning what we have 11389 1726854865.03064: results queue empty 11389 1726854865.03065: checking for any_errors_fatal 11389 1726854865.03075: done checking for any_errors_fatal 11389 1726854865.03076: checking for 
max_fail_percentage 11389 1726854865.03077: done checking for max_fail_percentage 11389 1726854865.03078: checking to see if all hosts have failed and the running result is not ok 11389 1726854865.03079: done checking to see if all hosts have failed 11389 1726854865.03080: getting the remaining hosts for this loop 11389 1726854865.03081: done getting the remaining hosts for this loop 11389 1726854865.03084: getting the next task for host managed_node3 11389 1726854865.03092: done getting next task for host managed_node3 11389 1726854865.03094: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11389 1726854865.03097: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854865.03101: getting variables 11389 1726854865.03102: in VariableManager get_vars() 11389 1726854865.03135: Calling all_inventory to load vars for managed_node3 11389 1726854865.03138: Calling groups_inventory to load vars for managed_node3 11389 1726854865.03139: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854865.03149: Calling all_plugins_play to load vars for managed_node3 11389 1726854865.03151: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854865.03153: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.04219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.05071: done with get_vars() 11389 1726854865.05089: done getting variables 11389 1726854865.05134: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854865.05216: variable 'profile' from source: include params 11389 1726854865.05219: variable 'item' from source: include params 11389 1726854865.05261: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:54:25 -0400 (0:00:00.036) 0:00:17.475 ****** 11389 1726854865.05284: entering _queue_task() for managed_node3/set_fact 11389 1726854865.05583: worker is 1 (out of 1 available) 11389 1726854865.05599: exiting _queue_task() for managed_node3/set_fact 11389 1726854865.05611: done queuing things up, now waiting for results queue to drain 11389 1726854865.05612: waiting for pending results... 
11389 1726854865.05927: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 11389 1726854865.05995: in run() - task 0affcc66-ac2b-deb8-c119-0000000003b9 11389 1726854865.06023: variable 'ansible_search_path' from source: unknown 11389 1726854865.06026: variable 'ansible_search_path' from source: unknown 11389 1726854865.06055: calling self._execute() 11389 1726854865.06194: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.06197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.06200: variable 'omit' from source: magic vars 11389 1726854865.06646: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.06656: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.06742: variable 'profile_stat' from source: set_fact 11389 1726854865.06753: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854865.06756: when evaluation is False, skipping this task 11389 1726854865.06759: _execute() done 11389 1726854865.06761: dumping result to json 11389 1726854865.06764: done dumping result, returning 11389 1726854865.06770: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcc66-ac2b-deb8-c119-0000000003b9] 11389 1726854865.06777: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b9 11389 1726854865.06870: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003b9 11389 1726854865.06872: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854865.06932: no more pending results, returning what we have 11389 1726854865.06936: results queue empty 11389 1726854865.06937: checking for any_errors_fatal 11389 1726854865.06942: done checking for any_errors_fatal 11389 1726854865.06942: checking for 
max_fail_percentage 11389 1726854865.06944: done checking for max_fail_percentage 11389 1726854865.06945: checking to see if all hosts have failed and the running result is not ok 11389 1726854865.06946: done checking to see if all hosts have failed 11389 1726854865.06947: getting the remaining hosts for this loop 11389 1726854865.06948: done getting the remaining hosts for this loop 11389 1726854865.06952: getting the next task for host managed_node3 11389 1726854865.06960: done getting next task for host managed_node3 11389 1726854865.06962: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11389 1726854865.06968: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854865.06972: getting variables 11389 1726854865.06973: in VariableManager get_vars() 11389 1726854865.07012: Calling all_inventory to load vars for managed_node3 11389 1726854865.07014: Calling groups_inventory to load vars for managed_node3 11389 1726854865.07016: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854865.07026: Calling all_plugins_play to load vars for managed_node3 11389 1726854865.07028: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854865.07030: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.07806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.08774: done with get_vars() 11389 1726854865.08792: done getting variables 11389 1726854865.08838: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854865.08927: variable 'profile' from source: include params 11389 1726854865.08930: variable 'item' from source: include params 11389 1726854865.08975: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:54:25 -0400 (0:00:00.037) 0:00:17.513 ****** 11389 1726854865.09000: entering _queue_task() for managed_node3/assert 11389 1726854865.09248: worker is 1 (out of 1 available) 11389 1726854865.09262: exiting _queue_task() for managed_node3/assert 11389 1726854865.09276: done queuing things up, now waiting for results queue to drain 11389 1726854865.09277: waiting for pending results... 
11389 1726854865.09445: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0'
11389 1726854865.09512: in run() - task 0affcc66-ac2b-deb8-c119-000000000260
11389 1726854865.09523: variable 'ansible_search_path' from source: unknown
11389 1726854865.09527: variable 'ansible_search_path' from source: unknown
11389 1726854865.09553: calling self._execute()
11389 1726854865.09626: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.09630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.09639: variable 'omit' from source: magic vars
11389 1726854865.09901: variable 'ansible_distribution_major_version' from source: facts
11389 1726854865.09910: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854865.09917: variable 'omit' from source: magic vars
11389 1726854865.09945: variable 'omit' from source: magic vars
11389 1726854865.10014: variable 'profile' from source: include params
11389 1726854865.10018: variable 'item' from source: include params
11389 1726854865.10064: variable 'item' from source: include params
11389 1726854865.10079: variable 'omit' from source: magic vars
11389 1726854865.10114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11389 1726854865.10140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11389 1726854865.10160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11389 1726854865.10172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.10183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.10209: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11389 1726854865.10212: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.10214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.10285: Set connection var ansible_module_compression to ZIP_DEFLATED
11389 1726854865.10293: Set connection var ansible_timeout to 10
11389 1726854865.10295: Set connection var ansible_connection to ssh
11389 1726854865.10300: Set connection var ansible_shell_type to sh
11389 1726854865.10305: Set connection var ansible_pipelining to False
11389 1726854865.10310: Set connection var ansible_shell_executable to /bin/sh
11389 1726854865.10325: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.10328: variable 'ansible_connection' from source: unknown
11389 1726854865.10330: variable 'ansible_module_compression' from source: unknown
11389 1726854865.10332: variable 'ansible_shell_type' from source: unknown
11389 1726854865.10334: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.10336: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.10341: variable 'ansible_pipelining' from source: unknown
11389 1726854865.10344: variable 'ansible_timeout' from source: unknown
11389 1726854865.10347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.10445: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11389 1726854865.10454: variable 'omit' from source: magic vars
11389 1726854865.10459: starting attempt loop
11389 1726854865.10462: running the handler
11389 1726854865.10539: variable 'lsr_net_profile_exists' from source: set_fact
11389 1726854865.10542: Evaluated conditional (lsr_net_profile_exists): True
11389 1726854865.10549: handler run complete
11389 1726854865.10560: attempt loop complete, returning result
11389 1726854865.10563: _execute() done
11389 1726854865.10568: dumping result to json
11389 1726854865.10571: done dumping result, returning
11389 1726854865.10574: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0affcc66-ac2b-deb8-c119-000000000260]
11389 1726854865.10580: sending task result for task 0affcc66-ac2b-deb8-c119-000000000260
11389 1726854865.10657: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000260
11389 1726854865.10660: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
11389 1726854865.10743: no more pending results, returning what we have
11389 1726854865.10746: results queue empty
11389 1726854865.10747: checking for any_errors_fatal
11389 1726854865.10753: done checking for any_errors_fatal
11389 1726854865.10753: checking for max_fail_percentage
11389 1726854865.10755: done checking for max_fail_percentage
11389 1726854865.10756: checking to see if all hosts have failed and the running result is not ok
11389 1726854865.10757: done checking to see if all hosts have failed
11389 1726854865.10757: getting the remaining hosts for this loop
11389 1726854865.10758: done getting the remaining hosts for this loop
11389 1726854865.10761: getting the next task for host managed_node3
11389 1726854865.10769: done getting next task for host managed_node3
11389 1726854865.10772: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
11389 1726854865.10775: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854865.10777: getting variables
11389 1726854865.10779: in VariableManager get_vars()
11389 1726854865.10814: Calling all_inventory to load vars for managed_node3
11389 1726854865.10817: Calling groups_inventory to load vars for managed_node3
11389 1726854865.10819: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854865.10828: Calling all_plugins_play to load vars for managed_node3
11389 1726854865.10830: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854865.10833: Calling groups_plugins_play to load vars for managed_node3
11389 1726854865.11600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854865.12468: done with get_vars()
11389 1726854865.12485: done getting variables
11389 1726854865.12530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
11389 1726854865.12615: variable 'profile' from source: include params
11389 1726854865.12618: variable 'item' from source: include params
11389 1726854865.12661: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0'] ***********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Friday 20 September 2024  13:54:25 -0400 (0:00:00.036)       0:00:17.549 ******
11389 1726854865.12691: entering _queue_task() for managed_node3/assert
11389 1726854865.12925: worker is 1 (out of 1 available)
11389 1726854865.12940: exiting _queue_task() for managed_node3/assert
11389 1726854865.12952: done queuing things up, now waiting for results queue to drain
11389 1726854865.12953: waiting for pending results...
11389 1726854865.13123: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0'
11389 1726854865.13193: in run() - task 0affcc66-ac2b-deb8-c119-000000000261
11389 1726854865.13206: variable 'ansible_search_path' from source: unknown
11389 1726854865.13209: variable 'ansible_search_path' from source: unknown
11389 1726854865.13237: calling self._execute()
11389 1726854865.13309: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.13312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.13321: variable 'omit' from source: magic vars
11389 1726854865.13573: variable 'ansible_distribution_major_version' from source: facts
11389 1726854865.13582: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854865.13590: variable 'omit' from source: magic vars
11389 1726854865.13622: variable 'omit' from source: magic vars
11389 1726854865.13688: variable 'profile' from source: include params
11389 1726854865.13693: variable 'item' from source: include params
11389 1726854865.13738: variable 'item' from source: include params
11389 1726854865.13752: variable 'omit' from source: magic vars
11389 1726854865.13785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11389 1726854865.13814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11389 1726854865.13828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11389 1726854865.13846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.13857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.13880: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11389 1726854865.13885: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.13889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.13957: Set connection var ansible_module_compression to ZIP_DEFLATED
11389 1726854865.13964: Set connection var ansible_timeout to 10
11389 1726854865.13969: Set connection var ansible_connection to ssh
11389 1726854865.13971: Set connection var ansible_shell_type to sh
11389 1726854865.13976: Set connection var ansible_pipelining to False
11389 1726854865.13981: Set connection var ansible_shell_executable to /bin/sh
11389 1726854865.13998: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.14001: variable 'ansible_connection' from source: unknown
11389 1726854865.14003: variable 'ansible_module_compression' from source: unknown
11389 1726854865.14006: variable 'ansible_shell_type' from source: unknown
11389 1726854865.14008: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.14010: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.14013: variable 'ansible_pipelining' from source: unknown
11389 1726854865.14015: variable 'ansible_timeout' from source: unknown
11389 1726854865.14020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.14117: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11389 1726854865.14126: variable 'omit' from source: magic vars
11389 1726854865.14131: starting attempt loop
11389 1726854865.14133: running the handler
11389 1726854865.14209: variable 'lsr_net_profile_ansible_managed' from source: set_fact
11389 1726854865.14213: Evaluated conditional (lsr_net_profile_ansible_managed): True
11389 1726854865.14219: handler run complete
11389 1726854865.14230: attempt loop complete, returning result
11389 1726854865.14233: _execute() done
11389 1726854865.14235: dumping result to json
11389 1726854865.14238: done dumping result, returning
11389 1726854865.14245: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcc66-ac2b-deb8-c119-000000000261]
11389 1726854865.14250: sending task result for task 0affcc66-ac2b-deb8-c119-000000000261
11389 1726854865.14331: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000261
11389 1726854865.14334: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
11389 1726854865.14424: no more pending results, returning what we have
11389 1726854865.14427: results queue empty
11389 1726854865.14427: checking for any_errors_fatal
11389 1726854865.14431: done checking for any_errors_fatal
11389 1726854865.14432: checking for max_fail_percentage
11389 1726854865.14433: done checking for max_fail_percentage
11389 1726854865.14434: checking to see if all hosts have failed and the running result is not ok
11389 1726854865.14435: done checking to see if all hosts have failed
11389 1726854865.14436: getting the remaining hosts for this loop
11389 1726854865.14437: done getting the remaining hosts for this loop
11389 1726854865.14440: getting the next task for host managed_node3
11389 1726854865.14448: done getting next task for host managed_node3
11389 1726854865.14450: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
11389 1726854865.14453: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854865.14456: getting variables
11389 1726854865.14457: in VariableManager get_vars()
11389 1726854865.14493: Calling all_inventory to load vars for managed_node3
11389 1726854865.14496: Calling groups_inventory to load vars for managed_node3
11389 1726854865.14498: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854865.14506: Calling all_plugins_play to load vars for managed_node3
11389 1726854865.14509: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854865.14511: Calling groups_plugins_play to load vars for managed_node3
11389 1726854865.15862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854865.17407: done with get_vars()
11389 1726854865.17432: done getting variables
11389 1726854865.17503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
11389 1726854865.17623: variable 'profile' from source: include params
11389 1726854865.17627: variable 'item' from source: include params
11389 1726854865.17693: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0] *****************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Friday 20 September 2024  13:54:25 -0400 (0:00:00.050)       0:00:17.600 ******
11389 1726854865.17731: entering _queue_task() for managed_node3/assert
11389 1726854865.18070: worker is 1 (out of 1 available)
11389 1726854865.18084: exiting _queue_task() for managed_node3/assert
11389 1726854865.18099: done queuing things up, now waiting for results queue to drain
11389 1726854865.18101: waiting for pending results...
11389 1726854865.18507: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0
11389 1726854865.18513: in run() - task 0affcc66-ac2b-deb8-c119-000000000262
11389 1726854865.18531: variable 'ansible_search_path' from source: unknown
11389 1726854865.18540: variable 'ansible_search_path' from source: unknown
11389 1726854865.18586: calling self._execute()
11389 1726854865.18698: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.18716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.18735: variable 'omit' from source: magic vars
11389 1726854865.19143: variable 'ansible_distribution_major_version' from source: facts
11389 1726854865.19169: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854865.19184: variable 'omit' from source: magic vars
11389 1726854865.19227: variable 'omit' from source: magic vars
11389 1726854865.19341: variable 'profile' from source: include params
11389 1726854865.19351: variable 'item' from source: include params
11389 1726854865.19424: variable 'item' from source: include params
11389 1726854865.19448: variable 'omit' from source: magic vars
11389 1726854865.19504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11389 1726854865.19544: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11389 1726854865.19558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11389 1726854865.19574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.19584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11389 1726854865.19613: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11389 1726854865.19616: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.19619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.19691: Set connection var ansible_module_compression to ZIP_DEFLATED
11389 1726854865.19700: Set connection var ansible_timeout to 10
11389 1726854865.19702: Set connection var ansible_connection to ssh
11389 1726854865.19705: Set connection var ansible_shell_type to sh
11389 1726854865.19711: Set connection var ansible_pipelining to False
11389 1726854865.19716: Set connection var ansible_shell_executable to /bin/sh
11389 1726854865.19738: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.19741: variable 'ansible_connection' from source: unknown
11389 1726854865.19744: variable 'ansible_module_compression' from source: unknown
11389 1726854865.19746: variable 'ansible_shell_type' from source: unknown
11389 1726854865.19748: variable 'ansible_shell_executable' from source: unknown
11389 1726854865.19750: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.19752: variable 'ansible_pipelining' from source: unknown
11389 1726854865.19754: variable 'ansible_timeout' from source: unknown
11389 1726854865.19756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.19857: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
11389 1726854865.19865: variable 'omit' from source: magic vars
11389 1726854865.19873: starting attempt loop
11389 1726854865.19876: running the handler
11389 1726854865.19954: variable 'lsr_net_profile_fingerprint' from source: set_fact
11389 1726854865.19958: Evaluated conditional (lsr_net_profile_fingerprint): True
11389 1726854865.19961: handler run complete
11389 1726854865.19973: attempt loop complete, returning result
11389 1726854865.19976: _execute() done
11389 1726854865.19978: dumping result to json
11389 1726854865.19981: done dumping result, returning
11389 1726854865.19989: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0affcc66-ac2b-deb8-c119-000000000262]
11389 1726854865.19994: sending task result for task 0affcc66-ac2b-deb8-c119-000000000262
ok: [managed_node3] => {
    "changed": false
}

MSG:

All assertions passed
11389 1726854865.20118: no more pending results, returning what we have
11389 1726854865.20121: results queue empty
11389 1726854865.20122: checking for any_errors_fatal
11389 1726854865.20127: done checking for any_errors_fatal
11389 1726854865.20128: checking for max_fail_percentage
11389 1726854865.20130: done checking for max_fail_percentage
11389 1726854865.20130: checking to see if all hosts have failed and the running result is not ok
11389 1726854865.20131: done checking to see if all hosts have failed
11389 1726854865.20132: getting the remaining hosts for this loop
11389 1726854865.20133: done getting the remaining hosts for this loop
11389 1726854865.20136: getting the next task for host managed_node3
11389 1726854865.20145: done getting next task for host managed_node3
11389 1726854865.20148: ^ task is: TASK: Include the task 'get_profile_stat.yml'
11389 1726854865.20151: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854865.20155: getting variables
11389 1726854865.20157: in VariableManager get_vars()
11389 1726854865.20198: Calling all_inventory to load vars for managed_node3
11389 1726854865.20201: Calling groups_inventory to load vars for managed_node3
11389 1726854865.20203: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854865.20214: Calling all_plugins_play to load vars for managed_node3
11389 1726854865.20217: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854865.20219: Calling groups_plugins_play to load vars for managed_node3
11389 1726854865.21020: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000262
11389 1726854865.21024: WORKER PROCESS EXITING
11389 1726854865.21035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854865.21893: done with get_vars()
11389 1726854865.21911: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Friday 20 September 2024  13:54:25 -0400 (0:00:00.042)       0:00:17.642 ******
11389 1726854865.21979: entering _queue_task() for managed_node3/include_tasks
11389 1726854865.22247: worker is 1 (out of 1 available)
11389 1726854865.22262: exiting _queue_task() for managed_node3/include_tasks
11389 1726854865.22273: done queuing things up, now waiting for results queue to drain
11389 1726854865.22275: waiting for pending results...
11389 1726854865.22446: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml'
11389 1726854865.22521: in run() - task 0affcc66-ac2b-deb8-c119-000000000266
11389 1726854865.22532: variable 'ansible_search_path' from source: unknown
11389 1726854865.22536: variable 'ansible_search_path' from source: unknown
11389 1726854865.22563: calling self._execute()
11389 1726854865.22635: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854865.22640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854865.22649: variable 'omit' from source: magic vars
11389 1726854865.22918: variable 'ansible_distribution_major_version' from source: facts
11389 1726854865.22928: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854865.22935: _execute() done
11389 1726854865.22938: dumping result to json
11389 1726854865.22941: done dumping result, returning
11389 1726854865.22950: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-deb8-c119-000000000266]
11389 1726854865.22953: sending task result for task 0affcc66-ac2b-deb8-c119-000000000266
11389 1726854865.23039: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000266
11389 1726854865.23041: WORKER PROCESS EXITING
11389 1726854865.23078: no more pending results, returning what we have
11389 1726854865.23083: in VariableManager get_vars()
11389 1726854865.23134: Calling all_inventory to load vars for managed_node3
11389 1726854865.23137: Calling groups_inventory to load vars for managed_node3
11389 1726854865.23139: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854865.23150: Calling all_plugins_play to load vars for managed_node3
11389 1726854865.23153: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854865.23156: Calling groups_plugins_play to load vars for managed_node3
11389 1726854865.24316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854865.25381: done with get_vars()
11389 1726854865.25403: variable 'ansible_search_path' from source: unknown
11389 1726854865.25404: variable 'ansible_search_path' from source: unknown
11389 1726854865.25431: we have included files to process
11389 1726854865.25431: generating all_blocks data
11389 1726854865.25433: done generating all_blocks data
11389 1726854865.25435: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11389 1726854865.25436: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11389 1726854865.25437: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
11389 1726854865.26037: done processing included file
11389 1726854865.26039: iterating over new_blocks loaded from include file
11389 1726854865.26040: in VariableManager get_vars()
11389 1726854865.26054: done with get_vars()
11389 1726854865.26055: filtering new block on tags
11389 1726854865.26072: done filtering new block on tags
11389 1726854865.26073: in VariableManager get_vars()
11389 1726854865.26085: done with get_vars()
11389 1726854865.26086: filtering new block on tags
11389 1726854865.26101: done filtering new block on tags
11389 1726854865.26102: done iterating over new_blocks loaded from include file
included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3
11389 1726854865.26107: extending task lists for all hosts with included blocks
11389 1726854865.26208: done extending task lists
11389 1726854865.26209: done processing included files
11389 1726854865.26209: results queue empty
11389 1726854865.26210: checking for any_errors_fatal
11389 1726854865.26212: done checking for any_errors_fatal
11389 1726854865.26213: checking for max_fail_percentage
11389 1726854865.26214: done checking for max_fail_percentage
11389 1726854865.26215: checking to see if all hosts have failed and the running result is not ok
11389 1726854865.26216: done checking to see if all hosts have failed
11389 1726854865.26217: getting the remaining hosts for this loop
11389 1726854865.26217: done getting the remaining hosts for this loop
11389 1726854865.26219: getting the next task for host managed_node3
11389 1726854865.26222: done getting next task for host managed_node3
11389 1726854865.26223: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
11389 1726854865.26225: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854865.26227: getting variables
11389 1726854865.26228: in VariableManager get_vars()
11389 1726854865.26237: Calling all_inventory to load vars for managed_node3
11389 1726854865.26239: Calling groups_inventory to load vars for managed_node3
11389 1726854865.26240: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854865.26244: Calling all_plugins_play to load vars for managed_node3
11389 1726854865.26246: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854865.26247: Calling groups_plugins_play to load vars for managed_node3
11389 1726854865.27128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854865.28666: done with get_vars()
11389 1726854865.28691: done getting variables
11389 1726854865.28739: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Friday 20 September 2024  13:54:25 -0400 (0:00:00.067)       0:00:17.710 ******
11389 1726854865.28771: entering _queue_task() for managed_node3/set_fact
11389 1726854865.29120: worker is 1 (out of 1 available)
11389 1726854865.29133: exiting _queue_task() for managed_node3/set_fact
11389 1726854865.29145: done queuing things up, now waiting for results queue to drain
11389 1726854865.29146: waiting for pending results...
11389 1726854865.29399: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11389 1726854865.29522: in run() - task 0affcc66-ac2b-deb8-c119-0000000003f8 11389 1726854865.29543: variable 'ansible_search_path' from source: unknown 11389 1726854865.29551: variable 'ansible_search_path' from source: unknown 11389 1726854865.29592: calling self._execute() 11389 1726854865.29692: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.29707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.29793: variable 'omit' from source: magic vars 11389 1726854865.30090: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.30108: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.30119: variable 'omit' from source: magic vars 11389 1726854865.30178: variable 'omit' from source: magic vars 11389 1726854865.30221: variable 'omit' from source: magic vars 11389 1726854865.30269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854865.30312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854865.30337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854865.30363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.30380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.30415: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854865.30423: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.30430: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11389 1726854865.30576: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854865.30579: Set connection var ansible_timeout to 10 11389 1726854865.30582: Set connection var ansible_connection to ssh 11389 1726854865.30584: Set connection var ansible_shell_type to sh 11389 1726854865.30586: Set connection var ansible_pipelining to False 11389 1726854865.30590: Set connection var ansible_shell_executable to /bin/sh 11389 1726854865.30604: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.30611: variable 'ansible_connection' from source: unknown 11389 1726854865.30618: variable 'ansible_module_compression' from source: unknown 11389 1726854865.30624: variable 'ansible_shell_type' from source: unknown 11389 1726854865.30630: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.30684: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.30690: variable 'ansible_pipelining' from source: unknown 11389 1726854865.30692: variable 'ansible_timeout' from source: unknown 11389 1726854865.30695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.30807: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854865.30824: variable 'omit' from source: magic vars 11389 1726854865.30834: starting attempt loop 11389 1726854865.30841: running the handler 11389 1726854865.30859: handler run complete 11389 1726854865.30874: attempt loop complete, returning result 11389 1726854865.30881: _execute() done 11389 1726854865.30901: dumping result to json 11389 1726854865.30903: done dumping result, returning 11389 1726854865.30911: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-deb8-c119-0000000003f8] 11389 1726854865.30992: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003f8 11389 1726854865.31049: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003f8 11389 1726854865.31051: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11389 1726854865.31107: no more pending results, returning what we have 11389 1726854865.31110: results queue empty 11389 1726854865.31111: checking for any_errors_fatal 11389 1726854865.31113: done checking for any_errors_fatal 11389 1726854865.31113: checking for max_fail_percentage 11389 1726854865.31115: done checking for max_fail_percentage 11389 1726854865.31116: checking to see if all hosts have failed and the running result is not ok 11389 1726854865.31117: done checking to see if all hosts have failed 11389 1726854865.31117: getting the remaining hosts for this loop 11389 1726854865.31119: done getting the remaining hosts for this loop 11389 1726854865.31122: getting the next task for host managed_node3 11389 1726854865.31129: done getting next task for host managed_node3 11389 1726854865.31132: ^ task is: TASK: Stat profile file 11389 1726854865.31136: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854865.31140: getting variables 11389 1726854865.31142: in VariableManager get_vars() 11389 1726854865.31182: Calling all_inventory to load vars for managed_node3 11389 1726854865.31185: Calling groups_inventory to load vars for managed_node3 11389 1726854865.31189: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854865.31201: Calling all_plugins_play to load vars for managed_node3 11389 1726854865.31204: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854865.31207: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.32808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.34266: done with get_vars() 11389 1726854865.34286: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:54:25 -0400 (0:00:00.055) 0:00:17.766 ****** 11389 1726854865.34355: entering _queue_task() for managed_node3/stat 11389 1726854865.34603: worker is 1 (out of 1 available) 11389 1726854865.34617: exiting _queue_task() for managed_node3/stat 11389 1726854865.34629: done queuing things up, now waiting for results queue to drain 11389 1726854865.34631: waiting for pending results... 
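The `set_fact` handler logged above runs entirely on the controller: it returns the supplied key/value pairs under `ansible_facts`, which the variable manager then merges into the host's facts. A minimal sketch of that merge, using the flag names from the logged task result (`merge_facts` is a hypothetical helper, not Ansible's actual API):

```python
# Hypothetical sketch: how a set_fact task result is folded into host
# facts. The flag names come from the logged result above; merge_facts
# is illustrative only, not a real Ansible function.
def merge_facts(host_facts: dict, task_result: dict) -> dict:
    """Merge the ansible_facts from a set_fact result into host facts."""
    merged = dict(host_facts)
    merged.update(task_result.get("ansible_facts", {}))
    return merged

result = {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": False,
        "lsr_net_profile_exists": False,
        "lsr_net_profile_fingerprint": False,
    },
    "changed": False,
}

facts = merge_facts({}, result)
```

Later tasks in the log (e.g. the `profile_stat`-based conditionals) read these merged facts back through the same variable manager.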
11389 1726854865.34810: running TaskExecutor() for managed_node3/TASK: Stat profile file 11389 1726854865.34895: in run() - task 0affcc66-ac2b-deb8-c119-0000000003f9 11389 1726854865.34907: variable 'ansible_search_path' from source: unknown 11389 1726854865.34911: variable 'ansible_search_path' from source: unknown 11389 1726854865.34942: calling self._execute() 11389 1726854865.35013: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.35017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.35030: variable 'omit' from source: magic vars 11389 1726854865.35294: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.35305: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.35311: variable 'omit' from source: magic vars 11389 1726854865.35342: variable 'omit' from source: magic vars 11389 1726854865.35416: variable 'profile' from source: include params 11389 1726854865.35420: variable 'item' from source: include params 11389 1726854865.35466: variable 'item' from source: include params 11389 1726854865.35483: variable 'omit' from source: magic vars 11389 1726854865.35519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854865.35545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854865.35561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854865.35578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.35590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.35614: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 
1726854865.35617: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.35620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.35691: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854865.35698: Set connection var ansible_timeout to 10 11389 1726854865.35701: Set connection var ansible_connection to ssh 11389 1726854865.35706: Set connection var ansible_shell_type to sh 11389 1726854865.35711: Set connection var ansible_pipelining to False 11389 1726854865.35715: Set connection var ansible_shell_executable to /bin/sh 11389 1726854865.35734: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.35737: variable 'ansible_connection' from source: unknown 11389 1726854865.35740: variable 'ansible_module_compression' from source: unknown 11389 1726854865.35742: variable 'ansible_shell_type' from source: unknown 11389 1726854865.35745: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.35747: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.35749: variable 'ansible_pipelining' from source: unknown 11389 1726854865.35752: variable 'ansible_timeout' from source: unknown 11389 1726854865.35756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.35920: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854865.35934: variable 'omit' from source: magic vars 11389 1726854865.35937: starting attempt loop 11389 1726854865.35940: running the handler 11389 1726854865.35952: _low_level_execute_command(): starting 11389 1726854865.35969: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854865.36794: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.36800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.36802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.36804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.36871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.38608: stdout chunk (state=3): >>>/root <<< 11389 1726854865.38709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.38753: stderr chunk (state=3): >>><<< 11389 1726854865.38757: stdout chunk (state=3): >>><<< 11389 1726854865.38775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.38788: _low_level_execute_command(): starting 11389 1726854865.38796: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183 `" && echo ansible-tmp-1726854865.3877587-12310-166062360471183="` echo /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183 `" ) && sleep 0' 11389 1726854865.39402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.39450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.39454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.39456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.39530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.41447: stdout chunk (state=3): >>>ansible-tmp-1726854865.3877587-12310-166062360471183=/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183 <<< 11389 1726854865.41576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.41602: stderr chunk (state=3): >>><<< 11389 1726854865.41605: stdout chunk (state=3): >>><<< 11389 1726854865.41621: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854865.3877587-12310-166062360471183=/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.41659: variable 'ansible_module_compression' from source: unknown 11389 1726854865.41719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854865.41755: variable 'ansible_facts' from source: unknown 11389 1726854865.41849: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py 11389 1726854865.42046: Sending initial data 11389 1726854865.42049: Sent initial data (153 bytes) 11389 1726854865.42579: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.42598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.42684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.42690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.42747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.42864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.44405: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854865.44457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854865.44595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpdm340_p5 /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py <<< 11389 1726854865.44602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py" <<< 11389 1726854865.44653: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpdm340_p5" to remote "/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py" <<< 11389 1726854865.44656: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py" <<< 11389 1726854865.45393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.45397: stdout chunk (state=3): >>><<< 11389 1726854865.45399: stderr chunk (state=3): >>><<< 11389 1726854865.45529: done transferring module to remote 11389 1726854865.45532: _low_level_execute_command(): starting 11389 1726854865.45535: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/ /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py && sleep 0' 11389 1726854865.46046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854865.46059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854865.46074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.46086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.46130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.46134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.46209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.48020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.48045: stderr chunk (state=3): >>><<< 11389 1726854865.48048: stdout chunk (state=3): >>><<< 11389 1726854865.48061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.48065: _low_level_execute_command(): starting 11389 1726854865.48070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/AnsiballZ_stat.py && sleep 0' 11389 1726854865.48493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.48496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854865.48499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.48501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854865.48504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854865.48515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.48569: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.48576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.48639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.66571: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854865.67696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854865.67700: stdout chunk (state=3): >>><<< 11389 1726854865.67847: stderr chunk (state=3): >>><<< 11389 1726854865.67851: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854865.67856: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854865.67859: _low_level_execute_command(): starting 11389 1726854865.67861: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854865.3877587-12310-166062360471183/ > /dev/null 2>&1 && sleep 0' 11389 1726854865.68430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854865.68446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854865.68460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.68529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.68579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.68608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.68621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.68715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.70625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.70651: stdout chunk (state=3): >>><<< 11389 1726854865.70655: stderr chunk (state=3): >>><<< 11389 1726854865.70793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.70796: handler run complete 11389 1726854865.70799: attempt loop complete, returning result 11389 1726854865.70801: _execute() done 11389 1726854865.70803: dumping result to json 11389 1726854865.70806: done dumping result, returning 11389 1726854865.70809: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-deb8-c119-0000000003f9] 11389 1726854865.70811: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003f9 11389 1726854865.70878: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003f9 11389 1726854865.70882: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11389 1726854865.70943: no more pending results, returning what we have 11389 1726854865.70947: results queue empty 11389 1726854865.70947: checking for any_errors_fatal 11389 1726854865.70953: done checking for any_errors_fatal 11389 1726854865.70954: checking for max_fail_percentage 11389 1726854865.70956: done checking for max_fail_percentage 11389 1726854865.70957: checking to see if all hosts have failed and the running result is not ok 11389 1726854865.70958: done checking to see if all hosts have failed 11389 1726854865.70958: getting the remaining hosts for this loop 11389 1726854865.70960: done getting the remaining hosts for this loop 11389 1726854865.70963: getting the next task for host managed_node3 
11389 1726854865.70972: done getting next task for host managed_node3 11389 1726854865.70975: ^ task is: TASK: Set NM profile exist flag based on the profile files 11389 1726854865.70979: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854865.70983: getting variables 11389 1726854865.70985: in VariableManager get_vars() 11389 1726854865.71025: Calling all_inventory to load vars for managed_node3 11389 1726854865.71027: Calling groups_inventory to load vars for managed_node3 11389 1726854865.71030: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854865.71041: Calling all_plugins_play to load vars for managed_node3 11389 1726854865.71044: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854865.71046: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.72176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.76539: done with get_vars() 11389 1726854865.76562: done getting variables 11389 1726854865.76608: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:54:25 -0400 (0:00:00.422) 0:00:18.189 ****** 11389 1726854865.76637: entering _queue_task() for managed_node3/set_fact 11389 1726854865.76975: worker is 1 (out of 1 available) 11389 1726854865.76989: exiting _queue_task() for managed_node3/set_fact 11389 1726854865.77002: done queuing things up, now waiting for results queue to drain 11389 1726854865.77004: waiting for pending results... 
11389 1726854865.77308: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11389 1726854865.77423: in run() - task 0affcc66-ac2b-deb8-c119-0000000003fa 11389 1726854865.77445: variable 'ansible_search_path' from source: unknown 11389 1726854865.77454: variable 'ansible_search_path' from source: unknown 11389 1726854865.77498: calling self._execute() 11389 1726854865.77596: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.77609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.77635: variable 'omit' from source: magic vars 11389 1726854865.77942: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.77951: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.78035: variable 'profile_stat' from source: set_fact 11389 1726854865.78047: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854865.78052: when evaluation is False, skipping this task 11389 1726854865.78055: _execute() done 11389 1726854865.78057: dumping result to json 11389 1726854865.78060: done dumping result, returning 11389 1726854865.78064: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-deb8-c119-0000000003fa] 11389 1726854865.78070: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fa 11389 1726854865.78161: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fa 11389 1726854865.78164: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854865.78226: no more pending results, returning what we have 11389 1726854865.78230: results queue empty 11389 1726854865.78231: checking for any_errors_fatal 11389 1726854865.78240: done checking for any_errors_fatal 11389 1726854865.78240: 
checking for max_fail_percentage 11389 1726854865.78242: done checking for max_fail_percentage 11389 1726854865.78243: checking to see if all hosts have failed and the running result is not ok 11389 1726854865.78244: done checking to see if all hosts have failed 11389 1726854865.78245: getting the remaining hosts for this loop 11389 1726854865.78246: done getting the remaining hosts for this loop 11389 1726854865.78249: getting the next task for host managed_node3 11389 1726854865.78256: done getting next task for host managed_node3 11389 1726854865.78258: ^ task is: TASK: Get NM profile info 11389 1726854865.78262: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854865.78269: getting variables 11389 1726854865.78271: in VariableManager get_vars() 11389 1726854865.78312: Calling all_inventory to load vars for managed_node3 11389 1726854865.78315: Calling groups_inventory to load vars for managed_node3 11389 1726854865.78316: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854865.78326: Calling all_plugins_play to load vars for managed_node3 11389 1726854865.78328: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854865.78330: Calling groups_plugins_play to load vars for managed_node3 11389 1726854865.79086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854865.80435: done with get_vars() 11389 1726854865.80458: done getting variables 11389 1726854865.80520: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:54:25 -0400 (0:00:00.039) 0:00:18.228 ****** 11389 1726854865.80552: entering _queue_task() for managed_node3/shell 11389 1726854865.80848: worker is 1 (out of 1 available) 11389 1726854865.80858: exiting _queue_task() for managed_node3/shell 11389 1726854865.80871: done queuing things up, now waiting for results queue to drain 11389 1726854865.80873: waiting for pending results... 
11389 1726854865.81163: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11389 1726854865.81245: in run() - task 0affcc66-ac2b-deb8-c119-0000000003fb 11389 1726854865.81258: variable 'ansible_search_path' from source: unknown 11389 1726854865.81267: variable 'ansible_search_path' from source: unknown 11389 1726854865.81299: calling self._execute() 11389 1726854865.81379: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.81383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.81393: variable 'omit' from source: magic vars 11389 1726854865.81674: variable 'ansible_distribution_major_version' from source: facts 11389 1726854865.81683: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854865.81691: variable 'omit' from source: magic vars 11389 1726854865.81727: variable 'omit' from source: magic vars 11389 1726854865.81798: variable 'profile' from source: include params 11389 1726854865.81802: variable 'item' from source: include params 11389 1726854865.81846: variable 'item' from source: include params 11389 1726854865.81861: variable 'omit' from source: magic vars 11389 1726854865.81899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854865.81925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854865.81943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854865.81957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.81967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854865.81996: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 
1726854865.81999: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.82002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.82068: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854865.82077: Set connection var ansible_timeout to 10 11389 1726854865.82080: Set connection var ansible_connection to ssh 11389 1726854865.82091: Set connection var ansible_shell_type to sh 11389 1726854865.82094: Set connection var ansible_pipelining to False 11389 1726854865.82096: Set connection var ansible_shell_executable to /bin/sh 11389 1726854865.82111: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.82115: variable 'ansible_connection' from source: unknown 11389 1726854865.82117: variable 'ansible_module_compression' from source: unknown 11389 1726854865.82120: variable 'ansible_shell_type' from source: unknown 11389 1726854865.82122: variable 'ansible_shell_executable' from source: unknown 11389 1726854865.82125: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854865.82127: variable 'ansible_pipelining' from source: unknown 11389 1726854865.82130: variable 'ansible_timeout' from source: unknown 11389 1726854865.82134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854865.82254: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854865.82262: variable 'omit' from source: magic vars 11389 1726854865.82266: starting attempt loop 11389 1726854865.82273: running the handler 11389 1726854865.82282: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854865.82305: _low_level_execute_command(): starting 11389 1726854865.82308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854865.82791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.82805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.82828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854865.82832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.82876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.82905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.83021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.84692: stdout chunk (state=3): >>>/root <<< 11389 1726854865.84817: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 11389 1726854865.84847: stderr chunk (state=3): >>><<< 11389 1726854865.84850: stdout chunk (state=3): >>><<< 11389 1726854865.84873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.84882: _low_level_execute_command(): starting 11389 1726854865.84891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577 `" && echo ansible-tmp-1726854865.8486936-12331-169659292125577="` echo /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577 `" ) && sleep 0' 11389 1726854865.85347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11389 1726854865.85359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.85362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.85364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.85410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.85413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854865.85415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.85478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.87408: stdout chunk (state=3): >>>ansible-tmp-1726854865.8486936-12331-169659292125577=/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577 <<< 11389 1726854865.87538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.87565: stderr chunk (state=3): >>><<< 11389 1726854865.87571: stdout chunk (state=3): >>><<< 11389 1726854865.87586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854865.8486936-12331-169659292125577=/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.87618: variable 'ansible_module_compression' from source: unknown 11389 1726854865.87659: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854865.87696: variable 'ansible_facts' from source: unknown 11389 1726854865.87748: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py 11389 1726854865.87852: Sending initial data 11389 1726854865.87856: Sent initial data (156 bytes) 11389 1726854865.88313: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.88316: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854865.88319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.88321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.88323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.88377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.88381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.88440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.89980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11389 1726854865.89985: stderr chunk 
(state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854865.90098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854865.90155: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpx2k3kmbx /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py <<< 11389 1726854865.90161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py" <<< 11389 1726854865.90213: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpx2k3kmbx" to remote "/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py" <<< 11389 1726854865.90218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py" <<< 11389 1726854865.90818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.90860: stderr chunk (state=3): >>><<< 11389 1726854865.90863: stdout chunk (state=3): >>><<< 11389 1726854865.90908: done transferring module to remote 11389 1726854865.90917: _low_level_execute_command(): starting 11389 1726854865.90922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/ /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py && sleep 0' 11389 1726854865.91368: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854865.91372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854865.91378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854865.91380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.91382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.91429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.91432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.91503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854865.93268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854865.93291: stderr chunk (state=3): >>><<< 11389 1726854865.93295: stdout chunk (state=3): >>><<< 11389 1726854865.93311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854865.93314: _low_level_execute_command(): starting 11389 1726854865.93316: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/AnsiballZ_command.py && sleep 0' 11389 1726854865.93754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.93757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854865.93760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.93762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854865.93764: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854865.93815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854865.93818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854865.93914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.11209: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:54:26.090508", "end": "2024-09-20 13:54:26.111090", "delta": "0:00:00.020582", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854866.12995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854866.12999: stdout chunk (state=3): >>><<< 11389 1726854866.13001: stderr chunk (state=3): >>><<< 11389 1726854866.13004: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:54:26.090508", "end": "2024-09-20 13:54:26.111090", "delta": "0:00:00.020582", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.244 closed. 11389 1726854866.13008: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854866.13015: _low_level_execute_command(): starting 11389 1726854866.13018: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854865.8486936-12331-169659292125577/ > /dev/null 2>&1 && sleep 0' 11389 1726854866.13590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854866.13601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854866.13679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.13711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854866.13724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.13741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.13823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.15711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854866.15715: stdout chunk (state=3): >>><<< 11389 1726854866.15717: stderr chunk (state=3): >>><<< 11389 1726854866.15821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854866.15824: handler run complete 11389 1726854866.15827: Evaluated conditional (False): False 11389 1726854866.15829: attempt loop complete, returning result 11389 1726854866.15831: _execute() done 11389 1726854866.15833: dumping result to json 11389 1726854866.15835: done dumping result, returning 11389 1726854866.15837: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-deb8-c119-0000000003fb] 11389 1726854866.15839: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fb 11389 1726854866.15905: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fb 11389 1726854866.15907: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020582", "end": "2024-09-20 13:54:26.111090", "rc": 0, "start": "2024-09-20 13:54:26.090508" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11389 1726854866.15992: no more pending results, returning what we have 11389 1726854866.15996: results queue empty 11389 1726854866.15997: checking for any_errors_fatal 11389 1726854866.16002: done checking for any_errors_fatal 11389 1726854866.16003: checking for max_fail_percentage 11389 1726854866.16005: done checking for max_fail_percentage 11389 1726854866.16005: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.16006: done checking to see if all hosts have failed 11389 1726854866.16007: getting the remaining hosts for this loop 11389 1726854866.16008: done getting the remaining hosts for this loop 11389 1726854866.16011: getting the next task for host managed_node3 11389 1726854866.16017: done getting next task for host managed_node3 11389 1726854866.16019: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based 
on the nmcli output 11389 1726854866.16023: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.16026: getting variables 11389 1726854866.16028: in VariableManager get_vars() 11389 1726854866.16064: Calling all_inventory to load vars for managed_node3 11389 1726854866.16069: Calling groups_inventory to load vars for managed_node3 11389 1726854866.16071: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.16080: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.16083: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.16085: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.17643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.19231: done with get_vars() 11389 1726854866.19259: done getting variables 11389 1726854866.19324: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:54:26 -0400 (0:00:00.388) 0:00:18.616 ****** 11389 1726854866.19365: entering _queue_task() for managed_node3/set_fact 11389 1726854866.19734: worker is 1 (out of 1 available) 11389 1726854866.19748: exiting _queue_task() for managed_node3/set_fact 11389 1726854866.19759: done queuing things up, now waiting for results queue to drain 11389 1726854866.19761: waiting for pending results... 11389 1726854866.20316: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11389 1726854866.20377: in run() - task 0affcc66-ac2b-deb8-c119-0000000003fc 11389 1726854866.20404: variable 'ansible_search_path' from source: unknown 11389 1726854866.20423: variable 'ansible_search_path' from source: unknown 11389 1726854866.20467: calling self._execute() 11389 1726854866.20572: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.20592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.20610: variable 'omit' from source: magic vars 11389 1726854866.21023: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.21042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.21291: variable 'nm_profile_exists' from source: set_fact 11389 1726854866.21294: Evaluated conditional (nm_profile_exists.rc == 0): True 11389 1726854866.21297: variable 'omit' from source: magic vars 11389 1726854866.21300: variable 'omit' from source: magic vars 11389 1726854866.21313: 
variable 'omit' from source: magic vars 11389 1726854866.21359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.21411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.21439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854866.21467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.21489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.21533: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854866.21543: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.21553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.21666: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.21681: Set connection var ansible_timeout to 10 11389 1726854866.21692: Set connection var ansible_connection to ssh 11389 1726854866.21703: Set connection var ansible_shell_type to sh 11389 1726854866.21713: Set connection var ansible_pipelining to False 11389 1726854866.21795: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.21798: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.21800: variable 'ansible_connection' from source: unknown 11389 1726854866.21803: variable 'ansible_module_compression' from source: unknown 11389 1726854866.21805: variable 'ansible_shell_type' from source: unknown 11389 1726854866.21808: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.21810: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.21812: variable 'ansible_pipelining' from 
source: unknown 11389 1726854866.21814: variable 'ansible_timeout' from source: unknown 11389 1726854866.21817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.21995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854866.21999: variable 'omit' from source: magic vars 11389 1726854866.22001: starting attempt loop 11389 1726854866.22003: running the handler 11389 1726854866.22006: handler run complete 11389 1726854866.22017: attempt loop complete, returning result 11389 1726854866.22043: _execute() done 11389 1726854866.22047: dumping result to json 11389 1726854866.22049: done dumping result, returning 11389 1726854866.22152: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-deb8-c119-0000000003fc] 11389 1726854866.22156: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fc 11389 1726854866.22225: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fc 11389 1726854866.22229: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11389 1726854866.22313: no more pending results, returning what we have 11389 1726854866.22317: results queue empty 11389 1726854866.22318: checking for any_errors_fatal 11389 1726854866.22325: done checking for any_errors_fatal 11389 1726854866.22326: checking for max_fail_percentage 11389 1726854866.22328: done checking for max_fail_percentage 11389 1726854866.22329: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.22331: 
done checking to see if all hosts have failed 11389 1726854866.22331: getting the remaining hosts for this loop 11389 1726854866.22333: done getting the remaining hosts for this loop 11389 1726854866.22337: getting the next task for host managed_node3 11389 1726854866.22347: done getting next task for host managed_node3 11389 1726854866.22350: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854866.22354: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854866.22359: getting variables 11389 1726854866.22360: in VariableManager get_vars() 11389 1726854866.22512: Calling all_inventory to load vars for managed_node3 11389 1726854866.22515: Calling groups_inventory to load vars for managed_node3 11389 1726854866.22518: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.22530: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.22533: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.22537: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.24930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.28331: done with get_vars() 11389 1726854866.28364: done getting variables 11389 1726854866.28439: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.28569: variable 'profile' from source: include params 11389 1726854866.28573: variable 'item' from source: include params 11389 1726854866.28643: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:54:26 -0400 (0:00:00.093) 0:00:18.709 ****** 11389 1726854866.28682: entering _queue_task() for managed_node3/command 11389 1726854866.29045: worker is 1 (out of 1 available) 11389 1726854866.29058: exiting _queue_task() for managed_node3/command 11389 1726854866.29182: done queuing things up, now waiting for results queue to drain 11389 1726854866.29185: waiting for pending results... 
11389 1726854866.29375: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11389 1726854866.29624: in run() - task 0affcc66-ac2b-deb8-c119-0000000003fe 11389 1726854866.29629: variable 'ansible_search_path' from source: unknown 11389 1726854866.29632: variable 'ansible_search_path' from source: unknown 11389 1726854866.29635: calling self._execute() 11389 1726854866.29697: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.29709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.29731: variable 'omit' from source: magic vars 11389 1726854866.30113: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.30130: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.30265: variable 'profile_stat' from source: set_fact 11389 1726854866.30296: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854866.30304: when evaluation is False, skipping this task 11389 1726854866.30312: _execute() done 11389 1726854866.30319: dumping result to json 11389 1726854866.30327: done dumping result, returning 11389 1726854866.30338: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcc66-ac2b-deb8-c119-0000000003fe] 11389 1726854866.30348: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fe 11389 1726854866.30543: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003fe 11389 1726854866.30546: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854866.30711: no more pending results, returning what we have 11389 1726854866.30715: results queue empty 11389 1726854866.30716: checking for any_errors_fatal 11389 1726854866.30721: done checking for any_errors_fatal 11389 1726854866.30722: 
checking for max_fail_percentage 11389 1726854866.30724: done checking for max_fail_percentage 11389 1726854866.30724: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.30725: done checking to see if all hosts have failed 11389 1726854866.30726: getting the remaining hosts for this loop 11389 1726854866.30728: done getting the remaining hosts for this loop 11389 1726854866.30731: getting the next task for host managed_node3 11389 1726854866.30737: done getting next task for host managed_node3 11389 1726854866.30739: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854866.30743: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854866.30747: getting variables 11389 1726854866.30748: in VariableManager get_vars() 11389 1726854866.30786: Calling all_inventory to load vars for managed_node3 11389 1726854866.30792: Calling groups_inventory to load vars for managed_node3 11389 1726854866.30795: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.30806: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.30809: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.30812: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.32377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.33965: done with get_vars() 11389 1726854866.33992: done getting variables 11389 1726854866.34059: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.34189: variable 'profile' from source: include params 11389 1726854866.34193: variable 'item' from source: include params 11389 1726854866.34256: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:54:26 -0400 (0:00:00.056) 0:00:18.766 ****** 11389 1726854866.34296: entering _queue_task() for managed_node3/set_fact 11389 1726854866.34727: worker is 1 (out of 1 available) 11389 1726854866.34738: exiting _queue_task() for managed_node3/set_fact 11389 1726854866.34750: done queuing things up, now waiting for results queue to drain 11389 1726854866.34752: waiting for pending results... 
11389 1726854866.34993: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11389 1726854866.35112: in run() - task 0affcc66-ac2b-deb8-c119-0000000003ff 11389 1726854866.35133: variable 'ansible_search_path' from source: unknown 11389 1726854866.35141: variable 'ansible_search_path' from source: unknown 11389 1726854866.35182: calling self._execute() 11389 1726854866.35304: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.35308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.35314: variable 'omit' from source: magic vars 11389 1726854866.35741: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.35745: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.35847: variable 'profile_stat' from source: set_fact 11389 1726854866.35871: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854866.35879: when evaluation is False, skipping this task 11389 1726854866.35890: _execute() done 11389 1726854866.35898: dumping result to json 11389 1726854866.35907: done dumping result, returning 11389 1726854866.35960: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcc66-ac2b-deb8-c119-0000000003ff] 11389 1726854866.35963: sending task result for task 0affcc66-ac2b-deb8-c119-0000000003ff 11389 1726854866.36034: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000003ff skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854866.36113: no more pending results, returning what we have 11389 1726854866.36117: results queue empty 11389 1726854866.36118: checking for any_errors_fatal 11389 1726854866.36125: done checking for any_errors_fatal 11389 1726854866.36126: checking for max_fail_percentage 11389 
1726854866.36128: done checking for max_fail_percentage 11389 1726854866.36129: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.36130: done checking to see if all hosts have failed 11389 1726854866.36131: getting the remaining hosts for this loop 11389 1726854866.36133: done getting the remaining hosts for this loop 11389 1726854866.36136: getting the next task for host managed_node3 11389 1726854866.36144: done getting next task for host managed_node3 11389 1726854866.36147: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11389 1726854866.36152: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854866.36157: getting variables 11389 1726854866.36159: in VariableManager get_vars() 11389 1726854866.36206: Calling all_inventory to load vars for managed_node3 11389 1726854866.36209: Calling groups_inventory to load vars for managed_node3 11389 1726854866.36212: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.36227: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.36230: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.36233: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.36903: WORKER PROCESS EXITING 11389 1726854866.37450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.38410: done with get_vars() 11389 1726854866.38424: done getting variables 11389 1726854866.38468: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.38547: variable 'profile' from source: include params 11389 1726854866.38550: variable 'item' from source: include params 11389 1726854866.38593: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:54:26 -0400 (0:00:00.043) 0:00:18.809 ****** 11389 1726854866.38626: entering _queue_task() for managed_node3/command 11389 1726854866.39141: worker is 1 (out of 1 available) 11389 1726854866.39152: exiting _queue_task() for managed_node3/command 11389 1726854866.39161: done queuing things up, now waiting for results queue to drain 11389 1726854866.39162: 
waiting for pending results... 11389 1726854866.39294: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 11389 1726854866.39376: in run() - task 0affcc66-ac2b-deb8-c119-000000000400 11389 1726854866.39403: variable 'ansible_search_path' from source: unknown 11389 1726854866.39411: variable 'ansible_search_path' from source: unknown 11389 1726854866.39449: calling self._execute() 11389 1726854866.39552: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.39604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.39655: variable 'omit' from source: magic vars 11389 1726854866.40047: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.40071: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.40151: variable 'profile_stat' from source: set_fact 11389 1726854866.40162: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854866.40169: when evaluation is False, skipping this task 11389 1726854866.40172: _execute() done 11389 1726854866.40177: dumping result to json 11389 1726854866.40180: done dumping result, returning 11389 1726854866.40186: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcc66-ac2b-deb8-c119-000000000400] 11389 1726854866.40192: sending task result for task 0affcc66-ac2b-deb8-c119-000000000400 11389 1726854866.40273: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000400 11389 1726854866.40277: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854866.40328: no more pending results, returning what we have 11389 1726854866.40332: results queue empty 11389 1726854866.40332: checking for any_errors_fatal 11389 1726854866.40339: done checking for any_errors_fatal 11389 
1726854866.40340: checking for max_fail_percentage 11389 1726854866.40342: done checking for max_fail_percentage 11389 1726854866.40343: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.40344: done checking to see if all hosts have failed 11389 1726854866.40344: getting the remaining hosts for this loop 11389 1726854866.40345: done getting the remaining hosts for this loop 11389 1726854866.40348: getting the next task for host managed_node3 11389 1726854866.40355: done getting next task for host managed_node3 11389 1726854866.40358: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11389 1726854866.40362: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854866.40369: getting variables 11389 1726854866.40370: in VariableManager get_vars() 11389 1726854866.40411: Calling all_inventory to load vars for managed_node3 11389 1726854866.40413: Calling groups_inventory to load vars for managed_node3 11389 1726854866.40415: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.40425: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.40427: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.40429: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.41198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.42223: done with get_vars() 11389 1726854866.42242: done getting variables 11389 1726854866.42309: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.42421: variable 'profile' from source: include params 11389 1726854866.42425: variable 'item' from source: include params 11389 1726854866.42486: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:54:26 -0400 (0:00:00.038) 0:00:18.848 ****** 11389 1726854866.42519: entering _queue_task() for managed_node3/set_fact 11389 1726854866.42841: worker is 1 (out of 1 available) 11389 1726854866.42852: exiting _queue_task() for managed_node3/set_fact 11389 1726854866.42863: done queuing things up, now waiting for results queue to drain 11389 1726854866.42865: waiting for pending results... 
11389 1726854866.43272: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11389 1726854866.43282: in run() - task 0affcc66-ac2b-deb8-c119-000000000401 11389 1726854866.43334: variable 'ansible_search_path' from source: unknown 11389 1726854866.43338: variable 'ansible_search_path' from source: unknown 11389 1726854866.43342: calling self._execute() 11389 1726854866.43422: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.43426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.43436: variable 'omit' from source: magic vars 11389 1726854866.43717: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.43727: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.43813: variable 'profile_stat' from source: set_fact 11389 1726854866.43824: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854866.43827: when evaluation is False, skipping this task 11389 1726854866.43831: _execute() done 11389 1726854866.43834: dumping result to json 11389 1726854866.43837: done dumping result, returning 11389 1726854866.43842: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcc66-ac2b-deb8-c119-000000000401] 11389 1726854866.43852: sending task result for task 0affcc66-ac2b-deb8-c119-000000000401 11389 1726854866.43933: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000401 11389 1726854866.43936: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854866.43997: no more pending results, returning what we have 11389 1726854866.44000: results queue empty 11389 1726854866.44001: checking for any_errors_fatal 11389 1726854866.44006: done checking for any_errors_fatal 11389 1726854866.44006: checking 
for max_fail_percentage 11389 1726854866.44009: done checking for max_fail_percentage 11389 1726854866.44010: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.44011: done checking to see if all hosts have failed 11389 1726854866.44011: getting the remaining hosts for this loop 11389 1726854866.44012: done getting the remaining hosts for this loop 11389 1726854866.44016: getting the next task for host managed_node3 11389 1726854866.44024: done getting next task for host managed_node3 11389 1726854866.44027: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11389 1726854866.44031: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854866.44035: getting variables 11389 1726854866.44037: in VariableManager get_vars() 11389 1726854866.44074: Calling all_inventory to load vars for managed_node3 11389 1726854866.44077: Calling groups_inventory to load vars for managed_node3 11389 1726854866.44079: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.44090: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.44092: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.44095: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.45083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.46442: done with get_vars() 11389 1726854866.46462: done getting variables 11389 1726854866.46527: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.46641: variable 'profile' from source: include params 11389 1726854866.46645: variable 'item' from source: include params 11389 1726854866.46705: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:54:26 -0400 (0:00:00.042) 0:00:18.890 ****** 11389 1726854866.46735: entering _queue_task() for managed_node3/assert 11389 1726854866.47055: worker is 1 (out of 1 available) 11389 1726854866.47069: exiting _queue_task() for managed_node3/assert 11389 1726854866.47080: done queuing things up, now waiting for results queue to drain 11389 1726854866.47081: waiting for pending results... 
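The assert task being queued here comes from `assert_profile_present.yml:5`. A hypothetical sketch of that task, reconstructed only from the task name, path, and the `lsr_net_profile_exists` conditional the log evaluates below (the `msg` field and exact layout are assumptions, not taken from the actual test file):

```yaml
# Hypothetical reconstruction of assert_profile_present.yml:5.
# Task name and the flag it checks are taken from the log; everything
# else (msg text, formatting) is assumed.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists          # set_fact flag evaluated in the log
    msg: "Profile '{{ profile }}' was not found"
```

The log shows the action plugin resolving to `ansible/plugins/action/assert.py` and the conditional `(lsr_net_profile_exists)` evaluating to True, which produces the `"All assertions passed"` result seen further down.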
11389 1726854866.47318: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 11389 1726854866.47383: in run() - task 0affcc66-ac2b-deb8-c119-000000000267 11389 1726854866.47398: variable 'ansible_search_path' from source: unknown 11389 1726854866.47402: variable 'ansible_search_path' from source: unknown 11389 1726854866.47430: calling self._execute() 11389 1726854866.47502: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.47505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.47521: variable 'omit' from source: magic vars 11389 1726854866.47775: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.47784: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.47792: variable 'omit' from source: magic vars 11389 1726854866.47817: variable 'omit' from source: magic vars 11389 1726854866.47891: variable 'profile' from source: include params 11389 1726854866.47895: variable 'item' from source: include params 11389 1726854866.47938: variable 'item' from source: include params 11389 1726854866.47961: variable 'omit' from source: magic vars 11389 1726854866.47989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.48016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.48032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854866.48046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.48055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.48081: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11389 1726854866.48085: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.48089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.48155: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.48162: Set connection var ansible_timeout to 10 11389 1726854866.48164: Set connection var ansible_connection to ssh 11389 1726854866.48171: Set connection var ansible_shell_type to sh 11389 1726854866.48173: Set connection var ansible_pipelining to False 11389 1726854866.48184: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.48198: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.48201: variable 'ansible_connection' from source: unknown 11389 1726854866.48203: variable 'ansible_module_compression' from source: unknown 11389 1726854866.48205: variable 'ansible_shell_type' from source: unknown 11389 1726854866.48208: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.48210: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.48214: variable 'ansible_pipelining' from source: unknown 11389 1726854866.48217: variable 'ansible_timeout' from source: unknown 11389 1726854866.48222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.48320: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854866.48329: variable 'omit' from source: magic vars 11389 1726854866.48334: starting attempt loop 11389 1726854866.48337: running the handler 11389 1726854866.48413: variable 'lsr_net_profile_exists' from source: set_fact 11389 1726854866.48416: Evaluated conditional 
(lsr_net_profile_exists): True 11389 1726854866.48422: handler run complete 11389 1726854866.48433: attempt loop complete, returning result 11389 1726854866.48436: _execute() done 11389 1726854866.48439: dumping result to json 11389 1726854866.48441: done dumping result, returning 11389 1726854866.48447: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0affcc66-ac2b-deb8-c119-000000000267] 11389 1726854866.48452: sending task result for task 0affcc66-ac2b-deb8-c119-000000000267 11389 1726854866.48531: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000267 11389 1726854866.48534: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854866.48580: no more pending results, returning what we have 11389 1726854866.48583: results queue empty 11389 1726854866.48583: checking for any_errors_fatal 11389 1726854866.48591: done checking for any_errors_fatal 11389 1726854866.48592: checking for max_fail_percentage 11389 1726854866.48594: done checking for max_fail_percentage 11389 1726854866.48595: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.48596: done checking to see if all hosts have failed 11389 1726854866.48596: getting the remaining hosts for this loop 11389 1726854866.48597: done getting the remaining hosts for this loop 11389 1726854866.48600: getting the next task for host managed_node3 11389 1726854866.48606: done getting next task for host managed_node3 11389 1726854866.48608: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11389 1726854866.48611: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.48615: getting variables 11389 1726854866.48616: in VariableManager get_vars() 11389 1726854866.48654: Calling all_inventory to load vars for managed_node3 11389 1726854866.48657: Calling groups_inventory to load vars for managed_node3 11389 1726854866.48659: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.48672: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.48674: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.48677: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.49943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.50818: done with get_vars() 11389 1726854866.50834: done getting variables 11389 1726854866.50877: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.50956: variable 'profile' from source: include params 11389 1726854866.50959: variable 'item' from source: include params 11389 1726854866.51002: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:54:26 -0400 
(0:00:00.042) 0:00:18.933 ****** 11389 1726854866.51027: entering _queue_task() for managed_node3/assert 11389 1726854866.51337: worker is 1 (out of 1 available) 11389 1726854866.51351: exiting _queue_task() for managed_node3/assert 11389 1726854866.51361: done queuing things up, now waiting for results queue to drain 11389 1726854866.51363: waiting for pending results... 11389 1726854866.51527: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11389 1726854866.51606: in run() - task 0affcc66-ac2b-deb8-c119-000000000268 11389 1726854866.51618: variable 'ansible_search_path' from source: unknown 11389 1726854866.51621: variable 'ansible_search_path' from source: unknown 11389 1726854866.51648: calling self._execute() 11389 1726854866.51719: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.51723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.51731: variable 'omit' from source: magic vars 11389 1726854866.51985: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.51995: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.52002: variable 'omit' from source: magic vars 11389 1726854866.52032: variable 'omit' from source: magic vars 11389 1726854866.52098: variable 'profile' from source: include params 11389 1726854866.52102: variable 'item' from source: include params 11389 1726854866.52147: variable 'item' from source: include params 11389 1726854866.52161: variable 'omit' from source: magic vars 11389 1726854866.52194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.52219: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.52235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 
1726854866.52252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.52263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.52289: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854866.52293: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.52295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.52362: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.52471: Set connection var ansible_timeout to 10 11389 1726854866.52475: Set connection var ansible_connection to ssh 11389 1726854866.52477: Set connection var ansible_shell_type to sh 11389 1726854866.52480: Set connection var ansible_pipelining to False 11389 1726854866.52483: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.52485: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.52491: variable 'ansible_connection' from source: unknown 11389 1726854866.52494: variable 'ansible_module_compression' from source: unknown 11389 1726854866.52496: variable 'ansible_shell_type' from source: unknown 11389 1726854866.52498: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.52500: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.52502: variable 'ansible_pipelining' from source: unknown 11389 1726854866.52505: variable 'ansible_timeout' from source: unknown 11389 1726854866.52507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.52518: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854866.52528: variable 'omit' from source: magic vars 11389 1726854866.52533: starting attempt loop 11389 1726854866.52536: running the handler 11389 1726854866.52612: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11389 1726854866.52615: Evaluated conditional (lsr_net_profile_ansible_managed): True 11389 1726854866.52621: handler run complete 11389 1726854866.52632: attempt loop complete, returning result 11389 1726854866.52635: _execute() done 11389 1726854866.52637: dumping result to json 11389 1726854866.52640: done dumping result, returning 11389 1726854866.52646: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcc66-ac2b-deb8-c119-000000000268] 11389 1726854866.52652: sending task result for task 0affcc66-ac2b-deb8-c119-000000000268 11389 1726854866.52728: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000268 11389 1726854866.52730: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854866.52776: no more pending results, returning what we have 11389 1726854866.52779: results queue empty 11389 1726854866.52780: checking for any_errors_fatal 11389 1726854866.52784: done checking for any_errors_fatal 11389 1726854866.52785: checking for max_fail_percentage 11389 1726854866.52788: done checking for max_fail_percentage 11389 1726854866.52789: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.52790: done checking to see if all hosts have failed 11389 1726854866.52791: getting the remaining hosts for this loop 11389 1726854866.52792: done getting the remaining hosts for this loop 11389 1726854866.52795: getting the next task for host managed_node3 11389 1726854866.52802: done getting 
next task for host managed_node3 11389 1726854866.52804: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11389 1726854866.52807: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.52812: getting variables 11389 1726854866.52813: in VariableManager get_vars() 11389 1726854866.52850: Calling all_inventory to load vars for managed_node3 11389 1726854866.52852: Calling groups_inventory to load vars for managed_node3 11389 1726854866.52855: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.52864: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.52869: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.52871: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.54411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.55450: done with get_vars() 11389 1726854866.55470: done getting variables 11389 1726854866.55519: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854866.55602: variable 'profile' from source: include params 11389 1726854866.55605: variable 'item' from 
source: include params 11389 1726854866.55647: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:54:26 -0400 (0:00:00.046) 0:00:18.979 ****** 11389 1726854866.55676: entering _queue_task() for managed_node3/assert 11389 1726854866.55921: worker is 1 (out of 1 available) 11389 1726854866.55933: exiting _queue_task() for managed_node3/assert 11389 1726854866.55944: done queuing things up, now waiting for results queue to drain 11389 1726854866.55945: waiting for pending results... 11389 1726854866.56122: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 11389 1726854866.56190: in run() - task 0affcc66-ac2b-deb8-c119-000000000269 11389 1726854866.56202: variable 'ansible_search_path' from source: unknown 11389 1726854866.56205: variable 'ansible_search_path' from source: unknown 11389 1726854866.56233: calling self._execute() 11389 1726854866.56312: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.56316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.56325: variable 'omit' from source: magic vars 11389 1726854866.56582: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.56593: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.56600: variable 'omit' from source: magic vars 11389 1726854866.56631: variable 'omit' from source: magic vars 11389 1726854866.56701: variable 'profile' from source: include params 11389 1726854866.56705: variable 'item' from source: include params 11389 1726854866.56750: variable 'item' from source: include params 11389 1726854866.56764: variable 'omit' from source: magic vars 11389 1726854866.56798: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.56857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.56860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854866.56886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.56935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.56938: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854866.56940: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.56943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.57046: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.57054: Set connection var ansible_timeout to 10 11389 1726854866.57057: Set connection var ansible_connection to ssh 11389 1726854866.57192: Set connection var ansible_shell_type to sh 11389 1726854866.57195: Set connection var ansible_pipelining to False 11389 1726854866.57198: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.57200: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.57203: variable 'ansible_connection' from source: unknown 11389 1726854866.57205: variable 'ansible_module_compression' from source: unknown 11389 1726854866.57207: variable 'ansible_shell_type' from source: unknown 11389 1726854866.57209: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.57211: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.57213: variable 'ansible_pipelining' from source: unknown 11389 1726854866.57215: variable 'ansible_timeout' 
from source: unknown 11389 1726854866.57218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.57236: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854866.57339: variable 'omit' from source: magic vars 11389 1726854866.57342: starting attempt loop 11389 1726854866.57345: running the handler 11389 1726854866.57357: variable 'lsr_net_profile_fingerprint' from source: set_fact 11389 1726854866.57363: Evaluated conditional (lsr_net_profile_fingerprint): True 11389 1726854866.57370: handler run complete 11389 1726854866.57383: attempt loop complete, returning result 11389 1726854866.57385: _execute() done 11389 1726854866.57390: dumping result to json 11389 1726854866.57393: done dumping result, returning 11389 1726854866.57401: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcc66-ac2b-deb8-c119-000000000269] 11389 1726854866.57407: sending task result for task 0affcc66-ac2b-deb8-c119-000000000269 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854866.57601: no more pending results, returning what we have 11389 1726854866.57605: results queue empty 11389 1726854866.57606: checking for any_errors_fatal 11389 1726854866.57612: done checking for any_errors_fatal 11389 1726854866.57613: checking for max_fail_percentage 11389 1726854866.57614: done checking for max_fail_percentage 11389 1726854866.57615: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.57616: done checking to see if all hosts have failed 11389 1726854866.57617: getting the remaining hosts for this loop 11389 1726854866.57618: done getting the remaining hosts for 
this loop 11389 1726854866.57621: getting the next task for host managed_node3 11389 1726854866.57629: done getting next task for host managed_node3 11389 1726854866.57632: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11389 1726854866.57634: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.57637: getting variables 11389 1726854866.57639: in VariableManager get_vars() 11389 1726854866.57762: Calling all_inventory to load vars for managed_node3 11389 1726854866.57765: Calling groups_inventory to load vars for managed_node3 11389 1726854866.57767: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.57777: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.57780: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.57784: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.58308: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000269 11389 1726854866.58315: WORKER PROCESS EXITING 11389 1726854866.59162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.60818: done with get_vars() 11389 1726854866.60849: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:54:26 -0400 (0:00:00.052) 0:00:19.032 ****** 11389 1726854866.60968: entering _queue_task() for managed_node3/include_tasks 11389 1726854866.61325: worker is 1 (out of 1 available) 11389 1726854866.61336: exiting _queue_task() for managed_node3/include_tasks 11389 1726854866.61349: done queuing things up, now waiting for results queue to drain 11389 1726854866.61350: waiting for pending results... 11389 1726854866.61716: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11389 1726854866.61755: in run() - task 0affcc66-ac2b-deb8-c119-00000000026d 11389 1726854866.61780: variable 'ansible_search_path' from source: unknown 11389 1726854866.61791: variable 'ansible_search_path' from source: unknown 11389 1726854866.61840: calling self._execute() 11389 1726854866.61950: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.61963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.61980: variable 'omit' from source: magic vars 11389 1726854866.62398: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.62416: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.62428: _execute() done 11389 1726854866.62475: dumping result to json 11389 1726854866.62479: done dumping result, returning 11389 1726854866.62481: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0affcc66-ac2b-deb8-c119-00000000026d] 11389 1726854866.62484: sending task result for task 0affcc66-ac2b-deb8-c119-00000000026d 11389 1726854866.62608: no more pending results, returning what we have 11389 1726854866.62614: in VariableManager get_vars() 11389 1726854866.62670: Calling all_inventory to load vars for managed_node3 11389 1726854866.62673: 
Calling groups_inventory to load vars for managed_node3 11389 1726854866.62676: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.62698: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.62702: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.62706: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.63677: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000026d 11389 1726854866.63681: WORKER PROCESS EXITING 11389 1726854866.64610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.65672: done with get_vars() 11389 1726854866.65694: variable 'ansible_search_path' from source: unknown 11389 1726854866.65696: variable 'ansible_search_path' from source: unknown 11389 1726854866.65725: we have included files to process 11389 1726854866.65726: generating all_blocks data 11389 1726854866.65728: done generating all_blocks data 11389 1726854866.65732: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854866.65733: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854866.65735: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11389 1726854866.66345: done processing included file 11389 1726854866.66346: iterating over new_blocks loaded from include file 11389 1726854866.66347: in VariableManager get_vars() 11389 1726854866.66362: done with get_vars() 11389 1726854866.66363: filtering new block on tags 11389 1726854866.66382: done filtering new block on tags 11389 1726854866.66384: in VariableManager get_vars() 11389 1726854866.66399: done with get_vars() 11389 1726854866.66401: filtering new block 
on tags 11389 1726854866.66414: done filtering new block on tags 11389 1726854866.66415: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11389 1726854866.66419: extending task lists for all hosts with included blocks 11389 1726854866.66525: done extending task lists 11389 1726854866.66526: done processing included files 11389 1726854866.66526: results queue empty 11389 1726854866.66527: checking for any_errors_fatal 11389 1726854866.66529: done checking for any_errors_fatal 11389 1726854866.66530: checking for max_fail_percentage 11389 1726854866.66530: done checking for max_fail_percentage 11389 1726854866.66531: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.66532: done checking to see if all hosts have failed 11389 1726854866.66532: getting the remaining hosts for this loop 11389 1726854866.66533: done getting the remaining hosts for this loop 11389 1726854866.66534: getting the next task for host managed_node3 11389 1726854866.66537: done getting next task for host managed_node3 11389 1726854866.66539: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11389 1726854866.66541: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.66542: getting variables 11389 1726854866.66543: in VariableManager get_vars() 11389 1726854866.66552: Calling all_inventory to load vars for managed_node3 11389 1726854866.66554: Calling groups_inventory to load vars for managed_node3 11389 1726854866.66555: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.66559: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.66560: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.66562: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.67598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.68619: done with get_vars() 11389 1726854866.68635: done getting variables 11389 1726854866.68668: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:54:26 -0400 (0:00:00.077) 0:00:19.109 ****** 11389 1726854866.68695: entering _queue_task() for managed_node3/set_fact 11389 1726854866.68950: worker is 1 (out of 1 available) 11389 1726854866.68963: exiting _queue_task() for managed_node3/set_fact 11389 1726854866.68976: done queuing things up, now waiting for results queue to drain 11389 1726854866.68978: waiting for pending results... 
11389 1726854866.69161: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11389 1726854866.69247: in run() - task 0affcc66-ac2b-deb8-c119-000000000440 11389 1726854866.69259: variable 'ansible_search_path' from source: unknown 11389 1726854866.69262: variable 'ansible_search_path' from source: unknown 11389 1726854866.69295: calling self._execute() 11389 1726854866.69367: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.69374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.69383: variable 'omit' from source: magic vars 11389 1726854866.69659: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.69672: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.69678: variable 'omit' from source: magic vars 11389 1726854866.69710: variable 'omit' from source: magic vars 11389 1726854866.69734: variable 'omit' from source: magic vars 11389 1726854866.69766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.69797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.69814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854866.69826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.69837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.69863: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854866.69867: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.69870: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11389 1726854866.69940: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.69946: Set connection var ansible_timeout to 10 11389 1726854866.69949: Set connection var ansible_connection to ssh 11389 1726854866.69953: Set connection var ansible_shell_type to sh 11389 1726854866.69960: Set connection var ansible_pipelining to False 11389 1726854866.69963: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.69989: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.69992: variable 'ansible_connection' from source: unknown 11389 1726854866.69996: variable 'ansible_module_compression' from source: unknown 11389 1726854866.69998: variable 'ansible_shell_type' from source: unknown 11389 1726854866.70000: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.70003: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.70005: variable 'ansible_pipelining' from source: unknown 11389 1726854866.70007: variable 'ansible_timeout' from source: unknown 11389 1726854866.70009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.70124: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854866.70132: variable 'omit' from source: magic vars 11389 1726854866.70137: starting attempt loop 11389 1726854866.70140: running the handler 11389 1726854866.70150: handler run complete 11389 1726854866.70158: attempt loop complete, returning result 11389 1726854866.70161: _execute() done 11389 1726854866.70163: dumping result to json 11389 1726854866.70165: done dumping result, returning 11389 1726854866.70175: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcc66-ac2b-deb8-c119-000000000440] 11389 1726854866.70181: sending task result for task 0affcc66-ac2b-deb8-c119-000000000440 11389 1726854866.70284: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000440 11389 1726854866.70286: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11389 1726854866.70556: no more pending results, returning what we have 11389 1726854866.70560: results queue empty 11389 1726854866.70561: checking for any_errors_fatal 11389 1726854866.70562: done checking for any_errors_fatal 11389 1726854866.70563: checking for max_fail_percentage 11389 1726854866.70565: done checking for max_fail_percentage 11389 1726854866.70565: checking to see if all hosts have failed and the running result is not ok 11389 1726854866.70566: done checking to see if all hosts have failed 11389 1726854866.70567: getting the remaining hosts for this loop 11389 1726854866.70568: done getting the remaining hosts for this loop 11389 1726854866.70572: getting the next task for host managed_node3 11389 1726854866.70578: done getting next task for host managed_node3 11389 1726854866.70580: ^ task is: TASK: Stat profile file 11389 1726854866.70583: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854866.70589: getting variables 11389 1726854866.70591: in VariableManager get_vars() 11389 1726854866.70625: Calling all_inventory to load vars for managed_node3 11389 1726854866.70628: Calling groups_inventory to load vars for managed_node3 11389 1726854866.70630: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854866.70640: Calling all_plugins_play to load vars for managed_node3 11389 1726854866.70642: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854866.70645: Calling groups_plugins_play to load vars for managed_node3 11389 1726854866.72025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854866.73689: done with get_vars() 11389 1726854866.73714: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:54:26 -0400 (0:00:00.051) 0:00:19.161 ****** 11389 1726854866.73814: entering _queue_task() for managed_node3/stat 11389 1726854866.74172: worker is 1 (out of 1 available) 11389 1726854866.74399: exiting _queue_task() for managed_node3/stat 11389 1726854866.74409: done queuing things up, now waiting for results queue to drain 11389 1726854866.74411: waiting for pending results... 
11389 1726854866.74607: running TaskExecutor() for managed_node3/TASK: Stat profile file 11389 1726854866.74612: in run() - task 0affcc66-ac2b-deb8-c119-000000000441 11389 1726854866.74642: variable 'ansible_search_path' from source: unknown 11389 1726854866.74650: variable 'ansible_search_path' from source: unknown 11389 1726854866.74692: calling self._execute() 11389 1726854866.74798: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.74856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.74860: variable 'omit' from source: magic vars 11389 1726854866.75229: variable 'ansible_distribution_major_version' from source: facts 11389 1726854866.75250: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854866.75261: variable 'omit' from source: magic vars 11389 1726854866.75320: variable 'omit' from source: magic vars 11389 1726854866.75430: variable 'profile' from source: include params 11389 1726854866.75462: variable 'item' from source: include params 11389 1726854866.75523: variable 'item' from source: include params 11389 1726854866.75545: variable 'omit' from source: magic vars 11389 1726854866.75593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854866.75679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854866.75684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854866.75686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.75705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854866.75745: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 
1726854866.75755: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.75764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.75884: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854866.75940: Set connection var ansible_timeout to 10 11389 1726854866.75943: Set connection var ansible_connection to ssh 11389 1726854866.75946: Set connection var ansible_shell_type to sh 11389 1726854866.75948: Set connection var ansible_pipelining to False 11389 1726854866.75951: Set connection var ansible_shell_executable to /bin/sh 11389 1726854866.75969: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.75977: variable 'ansible_connection' from source: unknown 11389 1726854866.75984: variable 'ansible_module_compression' from source: unknown 11389 1726854866.76003: variable 'ansible_shell_type' from source: unknown 11389 1726854866.76005: variable 'ansible_shell_executable' from source: unknown 11389 1726854866.76008: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854866.76048: variable 'ansible_pipelining' from source: unknown 11389 1726854866.76052: variable 'ansible_timeout' from source: unknown 11389 1726854866.76055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854866.76250: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854866.76276: variable 'omit' from source: magic vars 11389 1726854866.76291: starting attempt loop 11389 1726854866.76295: running the handler 11389 1726854866.76330: _low_level_execute_command(): starting 11389 1726854866.76333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854866.77091: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854866.77107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854866.77150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.77175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854866.77250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854866.77264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.77297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854866.77316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.77337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.77440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.79212: stdout chunk (state=3): >>>/root <<< 11389 1726854866.79373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854866.79378: stdout chunk (state=3): >>><<< 11389 1726854866.79380: stderr chunk (state=3): >>><<< 11389 1726854866.79403: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854866.79423: _low_level_execute_command(): starting 11389 1726854866.79511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344 `" && echo ansible-tmp-1726854866.79411-12373-132313165494344="` echo /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344 `" ) && sleep 0' 11389 1726854866.80081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854866.80099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854866.80114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854866.80246: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.80269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.80375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.82468: stdout chunk (state=3): >>>ansible-tmp-1726854866.79411-12373-132313165494344=/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344 <<< 11389 1726854866.82759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854866.82763: stdout chunk (state=3): >>><<< 11389 1726854866.82765: stderr chunk (state=3): >>><<< 11389 1726854866.82994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854866.79411-12373-132313165494344=/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854866.82998: variable 'ansible_module_compression' from source: unknown 11389 1726854866.83000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11389 1726854866.83002: variable 'ansible_facts' from source: unknown 11389 1726854866.83042: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py 11389 1726854866.83254: Sending initial data 11389 1726854866.83257: Sent initial data (151 bytes) 11389 1726854866.83899: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.83943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854866.83960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.83986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.84078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.85658: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854866.85799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854866.85910: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpwa69fp8z /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py <<< 11389 1726854866.85913: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py" <<< 11389 1726854866.85972: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpwa69fp8z" to remote "/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py" <<< 11389 1726854866.86809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854866.86853: stderr chunk (state=3): >>><<< 11389 1726854866.86856: stdout chunk (state=3): >>><<< 11389 1726854866.86870: done transferring module to remote 11389 1726854866.86886: _low_level_execute_command(): starting 11389 1726854866.86962: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/ /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py && sleep 0' 11389 1726854866.87549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854866.87602: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.87680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854866.87709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.87754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.87816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854866.89769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854866.89773: stdout chunk (state=3): >>><<< 11389 1726854866.89776: stderr chunk (state=3): >>><<< 11389 1726854866.89779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854866.89781: _low_level_execute_command(): starting 11389 1726854866.89783: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/AnsiballZ_stat.py && sleep 0' 11389 1726854866.90402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854866.90474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854866.90493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854866.90515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854866.90602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 
1726854867.05678: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11389 1726854867.07015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854867.07043: stderr chunk (state=3): >>><<< 11389 1726854867.07046: stdout chunk (state=3): >>><<< 11389 1726854867.07063: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854867.07093: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854867.07103: _low_level_execute_command(): starting 11389 1726854867.07108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854866.79411-12373-132313165494344/ > /dev/null 2>&1 && sleep 0' 11389 1726854867.07562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854867.07568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.07570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854867.07573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854867.07575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.07627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.07633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854867.07640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.07692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.09544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.09573: stderr chunk (state=3): >>><<< 11389 1726854867.09577: stdout chunk (state=3): >>><<< 11389 1726854867.09592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854867.09598: handler run complete 11389 1726854867.09614: attempt loop complete, returning result 11389 1726854867.09617: _execute() done 11389 1726854867.09620: dumping result to json 11389 1726854867.09623: done dumping result, returning 11389 1726854867.09631: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0affcc66-ac2b-deb8-c119-000000000441] 11389 1726854867.09635: sending task result for task 0affcc66-ac2b-deb8-c119-000000000441 11389 1726854867.09733: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000441 11389 1726854867.09736: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11389 1726854867.09792: no more pending results, returning what we have 11389 1726854867.09796: results queue empty 11389 1726854867.09797: checking for any_errors_fatal 11389 1726854867.09803: done checking for any_errors_fatal 11389 1726854867.09804: checking for max_fail_percentage 11389 1726854867.09806: done checking for max_fail_percentage 11389 1726854867.09807: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.09808: done checking to see if all hosts have failed 11389 1726854867.09808: getting the remaining hosts for this loop 11389 1726854867.09810: done getting the remaining hosts for this loop 11389 1726854867.09813: getting the next task for host managed_node3 11389 1726854867.09820: done getting next task for host managed_node3 11389 1726854867.09823: ^ task is: TASK: Set NM profile exist flag based on the profile files 11389 1726854867.09826: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.09830: getting variables 11389 1726854867.09832: in VariableManager get_vars() 11389 1726854867.09872: Calling all_inventory to load vars for managed_node3 11389 1726854867.09875: Calling groups_inventory to load vars for managed_node3 11389 1726854867.09877: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.09890: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.09892: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.09895: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.10818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.11665: done with get_vars() 11389 1726854867.11681: done getting variables 11389 1726854867.11726: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:54:27 -0400 (0:00:00.379) 0:00:19.540 ****** 11389 1726854867.11747: entering _queue_task() for managed_node3/set_fact 11389 1726854867.11968: worker is 1 (out of 1 available) 11389 1726854867.11981: exiting _queue_task() for managed_node3/set_fact 11389 1726854867.11994: done queuing things up, now waiting for results queue to drain 11389 1726854867.11996: waiting for pending results... 11389 1726854867.12166: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11389 1726854867.12237: in run() - task 0affcc66-ac2b-deb8-c119-000000000442 11389 1726854867.12249: variable 'ansible_search_path' from source: unknown 11389 1726854867.12253: variable 'ansible_search_path' from source: unknown 11389 1726854867.12283: calling self._execute() 11389 1726854867.12356: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.12361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.12371: variable 'omit' from source: magic vars 11389 1726854867.12643: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.12655: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.12739: variable 'profile_stat' from source: set_fact 11389 1726854867.12750: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854867.12753: when evaluation is False, skipping this task 11389 1726854867.12756: _execute() done 11389 1726854867.12759: dumping result to json 11389 1726854867.12761: done dumping result, returning 11389 1726854867.12774: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0affcc66-ac2b-deb8-c119-000000000442] 11389 1726854867.12777: sending task result for task 
0affcc66-ac2b-deb8-c119-000000000442 11389 1726854867.12848: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000442 11389 1726854867.12851: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854867.12921: no more pending results, returning what we have 11389 1726854867.12924: results queue empty 11389 1726854867.12925: checking for any_errors_fatal 11389 1726854867.12931: done checking for any_errors_fatal 11389 1726854867.12932: checking for max_fail_percentage 11389 1726854867.12934: done checking for max_fail_percentage 11389 1726854867.12934: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.12935: done checking to see if all hosts have failed 11389 1726854867.12936: getting the remaining hosts for this loop 11389 1726854867.12937: done getting the remaining hosts for this loop 11389 1726854867.12940: getting the next task for host managed_node3 11389 1726854867.12945: done getting next task for host managed_node3 11389 1726854867.12947: ^ task is: TASK: Get NM profile info 11389 1726854867.12950: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.12953: getting variables 11389 1726854867.12955: in VariableManager get_vars() 11389 1726854867.12991: Calling all_inventory to load vars for managed_node3 11389 1726854867.12993: Calling groups_inventory to load vars for managed_node3 11389 1726854867.12995: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.13005: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.13007: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.13009: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.13739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.14600: done with get_vars() 11389 1726854867.14614: done getting variables 11389 1726854867.14654: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:54:27 -0400 (0:00:00.029) 0:00:19.569 ****** 11389 1726854867.14678: entering _queue_task() for managed_node3/shell 11389 1726854867.14877: worker is 1 (out of 1 available) 11389 1726854867.14893: exiting _queue_task() for managed_node3/shell 11389 1726854867.14903: done queuing things up, now waiting for results queue to drain 11389 1726854867.14905: waiting for pending results... 
11389 1726854867.15061: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11389 1726854867.15128: in run() - task 0affcc66-ac2b-deb8-c119-000000000443 11389 1726854867.15140: variable 'ansible_search_path' from source: unknown 11389 1726854867.15144: variable 'ansible_search_path' from source: unknown 11389 1726854867.15172: calling self._execute() 11389 1726854867.15237: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.15243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.15254: variable 'omit' from source: magic vars 11389 1726854867.15504: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.15514: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.15520: variable 'omit' from source: magic vars 11389 1726854867.15548: variable 'omit' from source: magic vars 11389 1726854867.15619: variable 'profile' from source: include params 11389 1726854867.15623: variable 'item' from source: include params 11389 1726854867.15668: variable 'item' from source: include params 11389 1726854867.15680: variable 'omit' from source: magic vars 11389 1726854867.15715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854867.15740: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854867.15756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854867.15771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.15778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.15805: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 
1726854867.15808: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.15812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.15878: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854867.15885: Set connection var ansible_timeout to 10 11389 1726854867.15889: Set connection var ansible_connection to ssh 11389 1726854867.15894: Set connection var ansible_shell_type to sh 11389 1726854867.15902: Set connection var ansible_pipelining to False 11389 1726854867.15907: Set connection var ansible_shell_executable to /bin/sh 11389 1726854867.15922: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.15924: variable 'ansible_connection' from source: unknown 11389 1726854867.15927: variable 'ansible_module_compression' from source: unknown 11389 1726854867.15929: variable 'ansible_shell_type' from source: unknown 11389 1726854867.15931: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.15933: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.15937: variable 'ansible_pipelining' from source: unknown 11389 1726854867.15940: variable 'ansible_timeout' from source: unknown 11389 1726854867.15944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.16041: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.16050: variable 'omit' from source: magic vars 11389 1726854867.16055: starting attempt loop 11389 1726854867.16058: running the handler 11389 1726854867.16069: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.16082: _low_level_execute_command(): starting 11389 1726854867.16090: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854867.16579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854867.16609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.16613: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854867.16616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.16670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.16673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854867.16675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.16748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.18426: stdout chunk (state=3): >>>/root <<< 11389 1726854867.18585: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.18621: stderr chunk (state=3): >>><<< 11389 1726854867.18624: stdout chunk (state=3): >>><<< 11389 1726854867.18641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854867.18652: _low_level_execute_command(): starting 11389 1726854867.18658: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532 `" && echo ansible-tmp-1726854867.1864092-12386-188443652426532="` echo /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532 `" ) && sleep 0' 11389 1726854867.19076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11389 1726854867.19092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854867.19096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.19099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854867.19115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854867.19117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.19173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.19177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854867.19179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.19245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.21126: stdout chunk (state=3): >>>ansible-tmp-1726854867.1864092-12386-188443652426532=/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532 <<< 11389 1726854867.21317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.21340: stderr chunk (state=3): >>><<< 11389 1726854867.21343: stdout chunk (state=3): >>><<< 11389 1726854867.21356: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726854867.1864092-12386-188443652426532=/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854867.21383: variable 'ansible_module_compression' from source: unknown 11389 1726854867.21426: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854867.21459: variable 'ansible_facts' from source: unknown 11389 1726854867.21512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py 11389 1726854867.21608: Sending initial data 11389 1726854867.21611: Sent initial data (156 bytes) 11389 1726854867.22046: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11389 1726854867.22051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854867.22053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.22057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854867.22059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.22108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.22111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.22178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.23722: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854867.23799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854867.23859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpcvzd9xk_ /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py <<< 11389 1726854867.23865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py" <<< 11389 1726854867.23916: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpcvzd9xk_" to remote "/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py" <<< 11389 1726854867.23919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py" <<< 11389 1726854867.24518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.24560: stderr chunk (state=3): >>><<< 11389 1726854867.24564: stdout chunk (state=3): >>><<< 11389 1726854867.24582: done transferring module to remote 11389 1726854867.24593: _low_level_execute_command(): starting 11389 1726854867.24598: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/ /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py && sleep 0' 11389 1726854867.25046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854867.25054: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854867.25057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.25059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854867.25061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.25114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.25121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854867.25123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.25177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.26960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.26984: stderr chunk (state=3): >>><<< 11389 1726854867.26989: stdout chunk (state=3): >>><<< 11389 1726854867.27001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854867.27004: _low_level_execute_command(): starting 11389 1726854867.27010: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/AnsiballZ_command.py && sleep 0' 11389 1726854867.27425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854867.27429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854867.27431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854867.27433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11389 1726854867.27435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.27485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.27493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.27554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.44830: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:54:27.426512", "end": "2024-09-20 13:54:27.447412", "delta": "0:00:00.020900", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854867.46362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854867.46393: stderr chunk (state=3): >>><<< 11389 1726854867.46397: stdout chunk (state=3): >>><<< 11389 1726854867.46416: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:54:27.426512", "end": "2024-09-20 13:54:27.447412", "delta": "0:00:00.020900", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.244 closed. 11389 1726854867.46448: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854867.46456: _low_level_execute_command(): starting 11389 1726854867.46459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854867.1864092-12386-188443652426532/ > /dev/null 2>&1 && sleep 0' 11389 1726854867.46918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854867.46922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.46924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854867.46926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854867.46928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854867.46983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854867.46992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854867.46996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854867.47048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854867.48895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854867.48901: stdout chunk (state=3): >>><<< 11389 1726854867.48915: stderr chunk (state=3): >>><<< 11389 1726854867.49093: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854867.49096: handler run complete 11389 1726854867.49098: Evaluated conditional (False): False 11389 1726854867.49100: attempt loop complete, returning result 11389 1726854867.49102: _execute() done 11389 1726854867.49104: dumping result to json 11389 1726854867.49106: done dumping result, returning 11389 1726854867.49108: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0affcc66-ac2b-deb8-c119-000000000443] 11389 1726854867.49110: sending task result for task 0affcc66-ac2b-deb8-c119-000000000443 11389 1726854867.49177: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000443 11389 1726854867.49180: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020900", "end": "2024-09-20 13:54:27.447412", "rc": 0, "start": "2024-09-20 13:54:27.426512" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11389 1726854867.49257: no more pending results, returning what we have 11389 1726854867.49260: results queue empty 11389 1726854867.49261: checking for any_errors_fatal 11389 1726854867.49267: done checking for any_errors_fatal 11389 1726854867.49268: checking for max_fail_percentage 11389 1726854867.49270: done checking for max_fail_percentage 11389 1726854867.49271: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.49272: done checking to see if all hosts have failed 11389 1726854867.49273: getting the remaining hosts for this loop 11389 1726854867.49274: done getting the remaining hosts for this loop 11389 1726854867.49278: getting the next task for host managed_node3 11389 1726854867.49285: done getting next task for host 
managed_node3 11389 1726854867.49294: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11389 1726854867.49299: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854867.49303: getting variables 11389 1726854867.49304: in VariableManager get_vars() 11389 1726854867.49345: Calling all_inventory to load vars for managed_node3 11389 1726854867.49347: Calling groups_inventory to load vars for managed_node3 11389 1726854867.49350: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.49362: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.49365: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.49368: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.50791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.51641: done with get_vars() 11389 1726854867.51659: done getting variables 11389 1726854867.51704: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:54:27 -0400 (0:00:00.370) 0:00:19.940 ****** 11389 1726854867.51726: entering _queue_task() for managed_node3/set_fact 11389 1726854867.51968: worker is 1 (out of 1 available) 11389 1726854867.51981: exiting _queue_task() for managed_node3/set_fact 11389 1726854867.51994: done queuing things up, now waiting for results queue to drain 11389 1726854867.51996: waiting for pending results... 
11389 1726854867.52168: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11389 1726854867.52244: in run() - task 0affcc66-ac2b-deb8-c119-000000000444 11389 1726854867.52258: variable 'ansible_search_path' from source: unknown 11389 1726854867.52261: variable 'ansible_search_path' from source: unknown 11389 1726854867.52327: calling self._execute() 11389 1726854867.52436: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.52442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.52445: variable 'omit' from source: magic vars 11389 1726854867.52893: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.52897: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.52906: variable 'nm_profile_exists' from source: set_fact 11389 1726854867.52930: Evaluated conditional (nm_profile_exists.rc == 0): True 11389 1726854867.52943: variable 'omit' from source: magic vars 11389 1726854867.52999: variable 'omit' from source: magic vars 11389 1726854867.53035: variable 'omit' from source: magic vars 11389 1726854867.53082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854867.53124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854867.53144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854867.53161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.53176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.53208: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
11389 1726854867.53211: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.53213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.53312: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854867.53320: Set connection var ansible_timeout to 10 11389 1726854867.53322: Set connection var ansible_connection to ssh 11389 1726854867.53330: Set connection var ansible_shell_type to sh 11389 1726854867.53339: Set connection var ansible_pipelining to False 11389 1726854867.53342: Set connection var ansible_shell_executable to /bin/sh 11389 1726854867.53355: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.53362: variable 'ansible_connection' from source: unknown 11389 1726854867.53492: variable 'ansible_module_compression' from source: unknown 11389 1726854867.53495: variable 'ansible_shell_type' from source: unknown 11389 1726854867.53497: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.53499: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.53501: variable 'ansible_pipelining' from source: unknown 11389 1726854867.53503: variable 'ansible_timeout' from source: unknown 11389 1726854867.53506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.53542: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.53559: variable 'omit' from source: magic vars 11389 1726854867.53571: starting attempt loop 11389 1726854867.53578: running the handler 11389 1726854867.53595: handler run complete 11389 1726854867.53609: attempt loop complete, returning result 11389 1726854867.53615: _execute() done 
11389 1726854867.53621: dumping result to json 11389 1726854867.53628: done dumping result, returning 11389 1726854867.53639: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcc66-ac2b-deb8-c119-000000000444] 11389 1726854867.53648: sending task result for task 0affcc66-ac2b-deb8-c119-000000000444 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11389 1726854867.53814: no more pending results, returning what we have 11389 1726854867.53817: results queue empty 11389 1726854867.53818: checking for any_errors_fatal 11389 1726854867.53824: done checking for any_errors_fatal 11389 1726854867.53825: checking for max_fail_percentage 11389 1726854867.53827: done checking for max_fail_percentage 11389 1726854867.53828: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.53829: done checking to see if all hosts have failed 11389 1726854867.53829: getting the remaining hosts for this loop 11389 1726854867.53831: done getting the remaining hosts for this loop 11389 1726854867.53834: getting the next task for host managed_node3 11389 1726854867.53843: done getting next task for host managed_node3 11389 1726854867.53845: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854867.53849: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.53854: getting variables 11389 1726854867.53855: in VariableManager get_vars() 11389 1726854867.53896: Calling all_inventory to load vars for managed_node3 11389 1726854867.53899: Calling groups_inventory to load vars for managed_node3 11389 1726854867.53901: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.53912: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.53915: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.53920: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.54511: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000444 11389 1726854867.54515: WORKER PROCESS EXITING 11389 1726854867.55297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.57080: done with get_vars() 11389 1726854867.57103: done getting variables 11389 1726854867.57164: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.57282: variable 'profile' from source: include params 11389 1726854867.57286: variable 'item' from source: include params 11389 1726854867.57345: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:54:27 -0400 (0:00:00.056) 0:00:19.996 ****** 11389 1726854867.57390: entering _queue_task() for managed_node3/command 11389 1726854867.57734: worker is 1 (out of 1 available) 11389 1726854867.57745: exiting _queue_task() for managed_node3/command 11389 1726854867.57756: done queuing things up, now waiting for results queue to drain 11389 1726854867.57758: waiting for pending results... 11389 1726854867.58043: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11389 1726854867.58169: in run() - task 0affcc66-ac2b-deb8-c119-000000000446 11389 1726854867.58194: variable 'ansible_search_path' from source: unknown 11389 1726854867.58203: variable 'ansible_search_path' from source: unknown 11389 1726854867.58252: calling self._execute() 11389 1726854867.58351: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.58361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.58376: variable 'omit' from source: magic vars 11389 1726854867.58752: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.58782: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.58986: variable 'profile_stat' from source: set_fact 11389 1726854867.58991: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854867.58994: when evaluation is False, skipping this task 11389 1726854867.58996: _execute() done 11389 1726854867.58999: dumping result to json 11389 1726854867.59001: done dumping result, returning 11389 1726854867.59003: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcc66-ac2b-deb8-c119-000000000446] 11389 
1726854867.59006: sending task result for task 0affcc66-ac2b-deb8-c119-000000000446 11389 1726854867.59298: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000446 11389 1726854867.59301: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854867.59348: no more pending results, returning what we have 11389 1726854867.59352: results queue empty 11389 1726854867.59353: checking for any_errors_fatal 11389 1726854867.59359: done checking for any_errors_fatal 11389 1726854867.59360: checking for max_fail_percentage 11389 1726854867.59362: done checking for max_fail_percentage 11389 1726854867.59363: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.59364: done checking to see if all hosts have failed 11389 1726854867.59365: getting the remaining hosts for this loop 11389 1726854867.59366: done getting the remaining hosts for this loop 11389 1726854867.59369: getting the next task for host managed_node3 11389 1726854867.59376: done getting next task for host managed_node3 11389 1726854867.59378: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11389 1726854867.59382: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.59386: getting variables 11389 1726854867.59390: in VariableManager get_vars() 11389 1726854867.59429: Calling all_inventory to load vars for managed_node3 11389 1726854867.59433: Calling groups_inventory to load vars for managed_node3 11389 1726854867.59435: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.59446: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.59449: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.59452: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.60864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.62500: done with get_vars() 11389 1726854867.62537: done getting variables 11389 1726854867.62603: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.62728: variable 'profile' from source: include params 11389 1726854867.62733: variable 'item' from source: include params 11389 1726854867.62802: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:54:27 -0400 (0:00:00.054) 0:00:20.051 ****** 11389 1726854867.62834: entering _queue_task() for managed_node3/set_fact 11389 1726854867.63213: worker is 1 (out of 1 available) 11389 1726854867.63225: exiting _queue_task() for managed_node3/set_fact 11389 
1726854867.63239: done queuing things up, now waiting for results queue to drain 11389 1726854867.63241: waiting for pending results... 11389 1726854867.63619: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11389 1726854867.63657: in run() - task 0affcc66-ac2b-deb8-c119-000000000447 11389 1726854867.63673: variable 'ansible_search_path' from source: unknown 11389 1726854867.63677: variable 'ansible_search_path' from source: unknown 11389 1726854867.63794: calling self._execute() 11389 1726854867.63821: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.63825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.63829: variable 'omit' from source: magic vars 11389 1726854867.64192: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.64205: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.64329: variable 'profile_stat' from source: set_fact 11389 1726854867.64344: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854867.64347: when evaluation is False, skipping this task 11389 1726854867.64353: _execute() done 11389 1726854867.64356: dumping result to json 11389 1726854867.64361: done dumping result, returning 11389 1726854867.64371: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcc66-ac2b-deb8-c119-000000000447] 11389 1726854867.64377: sending task result for task 0affcc66-ac2b-deb8-c119-000000000447 11389 1726854867.64493: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000447 11389 1726854867.64609: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854867.64654: no more pending results, returning what we have 11389 1726854867.64657: results queue empty 
11389 1726854867.64658: checking for any_errors_fatal 11389 1726854867.64664: done checking for any_errors_fatal 11389 1726854867.64664: checking for max_fail_percentage 11389 1726854867.64669: done checking for max_fail_percentage 11389 1726854867.64670: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.64671: done checking to see if all hosts have failed 11389 1726854867.64671: getting the remaining hosts for this loop 11389 1726854867.64673: done getting the remaining hosts for this loop 11389 1726854867.64676: getting the next task for host managed_node3 11389 1726854867.64681: done getting next task for host managed_node3 11389 1726854867.64684: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11389 1726854867.64689: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854867.64693: getting variables 11389 1726854867.64694: in VariableManager get_vars() 11389 1726854867.64731: Calling all_inventory to load vars for managed_node3 11389 1726854867.64733: Calling groups_inventory to load vars for managed_node3 11389 1726854867.64735: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.64745: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.64748: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.64751: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.70810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.72382: done with get_vars() 11389 1726854867.72413: done getting variables 11389 1726854867.72461: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.72558: variable 'profile' from source: include params 11389 1726854867.72561: variable 'item' from source: include params 11389 1726854867.72623: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:54:27 -0400 (0:00:00.098) 0:00:20.149 ****** 11389 1726854867.72650: entering _queue_task() for managed_node3/command 11389 1726854867.72998: worker is 1 (out of 1 available) 11389 1726854867.73009: exiting _queue_task() for managed_node3/command 11389 1726854867.73020: done queuing things up, now waiting for results queue to drain 11389 1726854867.73021: waiting for pending results... 
11389 1726854867.73262: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 11389 1726854867.73439: in run() - task 0affcc66-ac2b-deb8-c119-000000000448 11389 1726854867.73443: variable 'ansible_search_path' from source: unknown 11389 1726854867.73446: variable 'ansible_search_path' from source: unknown 11389 1726854867.73449: calling self._execute() 11389 1726854867.73546: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.73552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.73560: variable 'omit' from source: magic vars 11389 1726854867.73943: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.73957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.74075: variable 'profile_stat' from source: set_fact 11389 1726854867.74090: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854867.74094: when evaluation is False, skipping this task 11389 1726854867.74097: _execute() done 11389 1726854867.74100: dumping result to json 11389 1726854867.74103: done dumping result, returning 11389 1726854867.74106: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcc66-ac2b-deb8-c119-000000000448] 11389 1726854867.74111: sending task result for task 0affcc66-ac2b-deb8-c119-000000000448 11389 1726854867.74299: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000448 11389 1726854867.74304: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854867.74356: no more pending results, returning what we have 11389 1726854867.74360: results queue empty 11389 1726854867.74361: checking for any_errors_fatal 11389 1726854867.74370: done checking for any_errors_fatal 11389 1726854867.74371: checking for 
max_fail_percentage 11389 1726854867.74373: done checking for max_fail_percentage 11389 1726854867.74374: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.74375: done checking to see if all hosts have failed 11389 1726854867.74376: getting the remaining hosts for this loop 11389 1726854867.74377: done getting the remaining hosts for this loop 11389 1726854867.74381: getting the next task for host managed_node3 11389 1726854867.74391: done getting next task for host managed_node3 11389 1726854867.74394: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11389 1726854867.74397: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854867.74403: getting variables 11389 1726854867.74404: in VariableManager get_vars() 11389 1726854867.74450: Calling all_inventory to load vars for managed_node3 11389 1726854867.74453: Calling groups_inventory to load vars for managed_node3 11389 1726854867.74456: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.74473: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.74476: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.74480: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.75935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.77515: done with get_vars() 11389 1726854867.77537: done getting variables 11389 1726854867.77596: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.77707: variable 'profile' from source: include params 11389 1726854867.77711: variable 'item' from source: include params 11389 1726854867.77769: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:54:27 -0400 (0:00:00.051) 0:00:20.201 ****** 11389 1726854867.77802: entering _queue_task() for managed_node3/set_fact 11389 1726854867.78115: worker is 1 (out of 1 available) 11389 1726854867.78128: exiting _queue_task() for managed_node3/set_fact 11389 1726854867.78141: done queuing things up, now waiting for results queue to drain 11389 1726854867.78143: waiting for pending results... 
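Both the command task and this set_fact task skip for the same reason: a `stat` task registered `profile_stat` earlier in get_profile_stat.yml and the ifcfg file is absent. A minimal sketch of that stat-then-act pattern (the stat task itself falls outside this excerpt, so its exact form is assumed; the fact name `lsr_net_profile_fingerprint` does appear later in this log):

```yaml
# Assumed shape of the earlier stat task that populated profile_stat;
# it is referenced by the skip records above but not shown in this excerpt.
- name: Get stat of the profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat

- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists
```

When the guard is False, Ansible still emits a full per-task result (the `skipping: [managed_node3]` JSON above) so downstream assert tasks can distinguish "checked and absent" from "never checked".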
11389 1726854867.78351: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11389 1726854867.78480: in run() - task 0affcc66-ac2b-deb8-c119-000000000449 11389 1726854867.78503: variable 'ansible_search_path' from source: unknown 11389 1726854867.78693: variable 'ansible_search_path' from source: unknown 11389 1726854867.78696: calling self._execute() 11389 1726854867.78699: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.78702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.78704: variable 'omit' from source: magic vars 11389 1726854867.79049: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.79066: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.79194: variable 'profile_stat' from source: set_fact 11389 1726854867.79213: Evaluated conditional (profile_stat.stat.exists): False 11389 1726854867.79221: when evaluation is False, skipping this task 11389 1726854867.79227: _execute() done 11389 1726854867.79234: dumping result to json 11389 1726854867.79242: done dumping result, returning 11389 1726854867.79256: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcc66-ac2b-deb8-c119-000000000449] 11389 1726854867.79268: sending task result for task 0affcc66-ac2b-deb8-c119-000000000449 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11389 1726854867.79527: no more pending results, returning what we have 11389 1726854867.79531: results queue empty 11389 1726854867.79532: checking for any_errors_fatal 11389 1726854867.79538: done checking for any_errors_fatal 11389 1726854867.79539: checking for max_fail_percentage 11389 1726854867.79541: done checking for max_fail_percentage 11389 1726854867.79541: checking to see if all hosts 
have failed and the running result is not ok 11389 1726854867.79542: done checking to see if all hosts have failed 11389 1726854867.79543: getting the remaining hosts for this loop 11389 1726854867.79544: done getting the remaining hosts for this loop 11389 1726854867.79547: getting the next task for host managed_node3 11389 1726854867.79555: done getting next task for host managed_node3 11389 1726854867.79557: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11389 1726854867.79559: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854867.79563: getting variables 11389 1726854867.79565: in VariableManager get_vars() 11389 1726854867.79605: Calling all_inventory to load vars for managed_node3 11389 1726854867.79607: Calling groups_inventory to load vars for managed_node3 11389 1726854867.79610: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.79619: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.79622: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.79625: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.80142: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000449 11389 1726854867.80147: WORKER PROCESS EXITING 11389 1726854867.81358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.82949: done with get_vars() 11389 1726854867.82977: done getting variables 11389 1726854867.83036: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.83157: variable 'profile' from source: include params 11389 1726854867.83161: variable 'item' from source: include params 11389 1726854867.83224: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:54:27 -0400 (0:00:00.054) 0:00:20.255 ****** 11389 1726854867.83254: entering _queue_task() for managed_node3/assert 11389 1726854867.83601: worker is 1 (out of 1 available) 11389 1726854867.83616: exiting _queue_task() for managed_node3/assert 11389 
1726854867.83628: done queuing things up, now waiting for results queue to drain 11389 1726854867.83630: waiting for pending results... 11389 1726854867.83904: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 11389 1726854867.83964: in run() - task 0affcc66-ac2b-deb8-c119-00000000026e 11389 1726854867.83986: variable 'ansible_search_path' from source: unknown 11389 1726854867.84001: variable 'ansible_search_path' from source: unknown 11389 1726854867.84041: calling self._execute() 11389 1726854867.84144: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.84155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.84212: variable 'omit' from source: magic vars 11389 1726854867.84557: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.84573: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.84585: variable 'omit' from source: magic vars 11389 1726854867.84632: variable 'omit' from source: magic vars 11389 1726854867.84728: variable 'profile' from source: include params 11389 1726854867.84736: variable 'item' from source: include params 11389 1726854867.84866: variable 'item' from source: include params 11389 1726854867.84870: variable 'omit' from source: magic vars 11389 1726854867.84873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854867.84974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854867.84995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854867.85019: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.85031: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.85065: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854867.85069: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.85074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.85293: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854867.85298: Set connection var ansible_timeout to 10 11389 1726854867.85301: Set connection var ansible_connection to ssh 11389 1726854867.85304: Set connection var ansible_shell_type to sh 11389 1726854867.85306: Set connection var ansible_pipelining to False 11389 1726854867.85309: Set connection var ansible_shell_executable to /bin/sh 11389 1726854867.85311: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.85313: variable 'ansible_connection' from source: unknown 11389 1726854867.85316: variable 'ansible_module_compression' from source: unknown 11389 1726854867.85318: variable 'ansible_shell_type' from source: unknown 11389 1726854867.85320: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.85323: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.85325: variable 'ansible_pipelining' from source: unknown 11389 1726854867.85349: variable 'ansible_timeout' from source: unknown 11389 1726854867.85352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.85410: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.85420: variable 'omit' from source: magic vars 11389 1726854867.85425: starting 
attempt loop 11389 1726854867.85428: running the handler 11389 1726854867.85555: variable 'lsr_net_profile_exists' from source: set_fact 11389 1726854867.85559: Evaluated conditional (lsr_net_profile_exists): True 11389 1726854867.85566: handler run complete 11389 1726854867.85584: attempt loop complete, returning result 11389 1726854867.85589: _execute() done 11389 1726854867.85592: dumping result to json 11389 1726854867.85594: done dumping result, returning 11389 1726854867.85601: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0affcc66-ac2b-deb8-c119-00000000026e] 11389 1726854867.85606: sending task result for task 0affcc66-ac2b-deb8-c119-00000000026e 11389 1726854867.85719: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000026e 11389 1726854867.85723: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854867.85891: no more pending results, returning what we have 11389 1726854867.85895: results queue empty 11389 1726854867.85896: checking for any_errors_fatal 11389 1726854867.85901: done checking for any_errors_fatal 11389 1726854867.85902: checking for max_fail_percentage 11389 1726854867.85904: done checking for max_fail_percentage 11389 1726854867.85905: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.85906: done checking to see if all hosts have failed 11389 1726854867.85907: getting the remaining hosts for this loop 11389 1726854867.85908: done getting the remaining hosts for this loop 11389 1726854867.85911: getting the next task for host managed_node3 11389 1726854867.85918: done getting next task for host managed_node3 11389 1726854867.85921: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11389 1726854867.85924: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.85928: getting variables 11389 1726854867.85929: in VariableManager get_vars() 11389 1726854867.85976: Calling all_inventory to load vars for managed_node3 11389 1726854867.85979: Calling groups_inventory to load vars for managed_node3 11389 1726854867.85982: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.85995: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.85998: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.86002: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.87486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.89136: done with get_vars() 11389 1726854867.89161: done getting variables 11389 1726854867.89223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854867.89348: variable 'profile' from source: include params 11389 1726854867.89351: variable 'item' from source: include params 11389 1726854867.89413: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:54:27 -0400 (0:00:00.061) 0:00:20.317 ****** 11389 1726854867.89449: entering _queue_task() for managed_node3/assert 11389 1726854867.89801: worker is 1 (out of 1 available) 11389 1726854867.89814: exiting _queue_task() for managed_node3/assert 11389 1726854867.89825: done queuing things up, now waiting for results queue to drain 11389 1726854867.89827: waiting for pending results... 11389 1726854867.90204: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11389 1726854867.90215: in run() - task 0affcc66-ac2b-deb8-c119-00000000026f 11389 1726854867.90226: variable 'ansible_search_path' from source: unknown 11389 1726854867.90249: variable 'ansible_search_path' from source: unknown 11389 1726854867.90294: calling self._execute() 11389 1726854867.90418: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.90440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.90467: variable 'omit' from source: magic vars 11389 1726854867.90932: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.90948: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.90960: variable 'omit' from source: magic vars 11389 1726854867.91077: variable 'omit' from source: magic vars 11389 1726854867.91128: variable 'profile' from source: include params 11389 1726854867.91139: variable 'item' from source: include params 11389 1726854867.91223: variable 'item' from source: include params 11389 1726854867.91256: variable 'omit' from source: magic vars 11389 1726854867.91306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854867.91379: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854867.91513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854867.91517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.91530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.91770: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854867.91773: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.91776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.91835: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854867.91850: Set connection var ansible_timeout to 10 11389 1726854867.91858: Set connection var ansible_connection to ssh 11389 1726854867.91867: Set connection var ansible_shell_type to sh 11389 1726854867.91881: Set connection var ansible_pipelining to False 11389 1726854867.91891: Set connection var ansible_shell_executable to /bin/sh 11389 1726854867.91915: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.91921: variable 'ansible_connection' from source: unknown 11389 1726854867.91927: variable 'ansible_module_compression' from source: unknown 11389 1726854867.91933: variable 'ansible_shell_type' from source: unknown 11389 1726854867.91939: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.91986: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.91991: variable 'ansible_pipelining' from source: unknown 11389 1726854867.91994: variable 'ansible_timeout' from source: unknown 11389 1726854867.91996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 
1726854867.92100: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.92120: variable 'omit' from source: magic vars 11389 1726854867.92131: starting attempt loop 11389 1726854867.92138: running the handler 11389 1726854867.92246: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11389 1726854867.92310: Evaluated conditional (lsr_net_profile_ansible_managed): True 11389 1726854867.92313: handler run complete 11389 1726854867.92316: attempt loop complete, returning result 11389 1726854867.92318: _execute() done 11389 1726854867.92321: dumping result to json 11389 1726854867.92323: done dumping result, returning 11389 1726854867.92325: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcc66-ac2b-deb8-c119-00000000026f] 11389 1726854867.92327: sending task result for task 0affcc66-ac2b-deb8-c119-00000000026f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854867.92452: no more pending results, returning what we have 11389 1726854867.92455: results queue empty 11389 1726854867.92456: checking for any_errors_fatal 11389 1726854867.92462: done checking for any_errors_fatal 11389 1726854867.92463: checking for max_fail_percentage 11389 1726854867.92465: done checking for max_fail_percentage 11389 1726854867.92468: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.92469: done checking to see if all hosts have failed 11389 1726854867.92469: getting the remaining hosts for this loop 11389 1726854867.92470: done getting the remaining hosts for this loop 11389 1726854867.92473: getting the next task for host managed_node3 11389 1726854867.92479: done 
getting next task for host managed_node3 11389 1726854867.92481: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11389 1726854867.92484: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.92493: getting variables 11389 1726854867.92494: in VariableManager get_vars() 11389 1726854867.92532: Calling all_inventory to load vars for managed_node3 11389 1726854867.92535: Calling groups_inventory to load vars for managed_node3 11389 1726854867.92538: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.92549: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.92552: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.92556: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.93228: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000026f 11389 1726854867.93232: WORKER PROCESS EXITING 11389 1726854867.94140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854867.95721: done with get_vars() 11389 1726854867.95752: done getting variables 11389 1726854867.95815: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) 11389 1726854867.95935: variable 'profile' from source: include params 11389 1726854867.95939: variable 'item' from source: include params 11389 1726854867.96006: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:54:27 -0400 (0:00:00.065) 0:00:20.383 ****** 11389 1726854867.96040: entering _queue_task() for managed_node3/assert 11389 1726854867.96370: worker is 1 (out of 1 available) 11389 1726854867.96393: exiting _queue_task() for managed_node3/assert 11389 1726854867.96404: done queuing things up, now waiting for results queue to drain 11389 1726854867.96406: waiting for pending results... 11389 1726854867.96627: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 11389 1726854867.96734: in run() - task 0affcc66-ac2b-deb8-c119-000000000270 11389 1726854867.96745: variable 'ansible_search_path' from source: unknown 11389 1726854867.96748: variable 'ansible_search_path' from source: unknown 11389 1726854867.96805: calling self._execute() 11389 1726854867.96941: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.96944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.96947: variable 'omit' from source: magic vars 11389 1726854867.97282: variable 'ansible_distribution_major_version' from source: facts 11389 1726854867.97294: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854867.97301: variable 'omit' from source: magic vars 11389 1726854867.97347: variable 'omit' from source: magic vars 11389 1726854867.97489: variable 'profile' from source: include params 11389 1726854867.97494: variable 'item' from source: include params 11389 
1726854867.97522: variable 'item' from source: include params 11389 1726854867.97548: variable 'omit' from source: magic vars 11389 1726854867.97594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854867.97705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854867.97708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854867.97711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.97714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854867.97717: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854867.97720: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854867.97722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.97828: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854867.97837: Set connection var ansible_timeout to 10 11389 1726854867.97840: Set connection var ansible_connection to ssh 11389 1726854867.97843: Set connection var ansible_shell_type to sh 11389 1726854867.97848: Set connection var ansible_pipelining to False 11389 1726854867.97860: Set connection var ansible_shell_executable to /bin/sh 11389 1726854867.97921: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.97925: variable 'ansible_connection' from source: unknown 11389 1726854867.97927: variable 'ansible_module_compression' from source: unknown 11389 1726854867.97929: variable 'ansible_shell_type' from source: unknown 11389 1726854867.97932: variable 'ansible_shell_executable' from source: unknown 11389 1726854867.97934: variable 'ansible_host' from source: host 
vars for 'managed_node3' 11389 1726854867.97936: variable 'ansible_pipelining' from source: unknown 11389 1726854867.97940: variable 'ansible_timeout' from source: unknown 11389 1726854867.97947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854867.98048: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854867.98140: variable 'omit' from source: magic vars 11389 1726854867.98144: starting attempt loop 11389 1726854867.98146: running the handler 11389 1726854867.98175: variable 'lsr_net_profile_fingerprint' from source: set_fact 11389 1726854867.98185: Evaluated conditional (lsr_net_profile_fingerprint): True 11389 1726854867.98193: handler run complete 11389 1726854867.98207: attempt loop complete, returning result 11389 1726854867.98209: _execute() done 11389 1726854867.98212: dumping result to json 11389 1726854867.98215: done dumping result, returning 11389 1726854867.98222: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcc66-ac2b-deb8-c119-000000000270] 11389 1726854867.98227: sending task result for task 0affcc66-ac2b-deb8-c119-000000000270 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11389 1726854867.98399: no more pending results, returning what we have 11389 1726854867.98402: results queue empty 11389 1726854867.98403: checking for any_errors_fatal 11389 1726854867.98411: done checking for any_errors_fatal 11389 1726854867.98411: checking for max_fail_percentage 11389 1726854867.98413: done checking for max_fail_percentage 11389 1726854867.98414: checking to see if all hosts have failed and the running result is not ok 11389 1726854867.98415: done checking to see 
if all hosts have failed 11389 1726854867.98415: getting the remaining hosts for this loop 11389 1726854867.98417: done getting the remaining hosts for this loop 11389 1726854867.98419: getting the next task for host managed_node3 11389 1726854867.98426: done getting next task for host managed_node3 11389 1726854867.98429: ^ task is: TASK: ** TEST check polling interval 11389 1726854867.98430: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854867.98434: getting variables 11389 1726854867.98435: in VariableManager get_vars() 11389 1726854867.98568: Calling all_inventory to load vars for managed_node3 11389 1726854867.98571: Calling groups_inventory to load vars for managed_node3 11389 1726854867.98573: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854867.98583: Calling all_plugins_play to load vars for managed_node3 11389 1726854867.98585: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854867.98591: Calling groups_plugins_play to load vars for managed_node3 11389 1726854867.99111: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000270 11389 1726854867.99115: WORKER PROCESS EXITING 11389 1726854867.99937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854868.01954: done with get_vars() 11389 1726854868.01976: done getting variables 11389 1726854868.02053: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check 
polling interval] ****************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 13:54:28 -0400 (0:00:00.060) 0:00:20.443 ****** 11389 1726854868.02082: entering _queue_task() for managed_node3/command 11389 1726854868.02504: worker is 1 (out of 1 available) 11389 1726854868.02518: exiting _queue_task() for managed_node3/command 11389 1726854868.02530: done queuing things up, now waiting for results queue to drain 11389 1726854868.02532: waiting for pending results... 11389 1726854868.03013: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 11389 1726854868.03018: in run() - task 0affcc66-ac2b-deb8-c119-000000000071 11389 1726854868.03022: variable 'ansible_search_path' from source: unknown 11389 1726854868.03026: calling self._execute() 11389 1726854868.03028: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.03030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.03033: variable 'omit' from source: magic vars 11389 1726854868.03522: variable 'ansible_distribution_major_version' from source: facts 11389 1726854868.03531: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854868.03538: variable 'omit' from source: magic vars 11389 1726854868.03658: variable 'omit' from source: magic vars 11389 1726854868.03661: variable 'controller_device' from source: play vars 11389 1726854868.03667: variable 'omit' from source: magic vars 11389 1726854868.03714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854868.03750: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854868.03776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854868.03793: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.03804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.03836: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854868.03839: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.03842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.03943: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854868.03949: Set connection var ansible_timeout to 10 11389 1726854868.03952: Set connection var ansible_connection to ssh 11389 1726854868.03981: Set connection var ansible_shell_type to sh 11389 1726854868.03984: Set connection var ansible_pipelining to False 11389 1726854868.03989: Set connection var ansible_shell_executable to /bin/sh 11389 1726854868.03995: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.03997: variable 'ansible_connection' from source: unknown 11389 1726854868.04000: variable 'ansible_module_compression' from source: unknown 11389 1726854868.04003: variable 'ansible_shell_type' from source: unknown 11389 1726854868.04005: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.04007: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.04009: variable 'ansible_pipelining' from source: unknown 11389 1726854868.04012: variable 'ansible_timeout' from source: unknown 11389 1726854868.04014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.04201: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854868.04206: variable 'omit' from source: magic vars 11389 1726854868.04214: starting attempt loop 11389 1726854868.04216: running the handler 11389 1726854868.04218: _low_level_execute_command(): starting 11389 1726854868.04220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854868.05252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.05599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.05836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.05839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.07514: stdout chunk (state=3): >>>/root <<< 11389 1726854868.07818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 
1726854868.07822: stdout chunk (state=3): >>><<< 11389 1726854868.07825: stderr chunk (state=3): >>><<< 11389 1726854868.07829: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.07832: _low_level_execute_command(): starting 11389 1726854868.07835: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215 `" && echo ansible-tmp-1726854868.0768573-12415-180957692233215="` echo /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215 `" ) && sleep 0' 11389 1726854868.08318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.08327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11389 1726854868.08338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.08353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.08368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854868.08373: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854868.08386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.08401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854868.08411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854868.08418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11389 1726854868.08426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.08435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.08448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.08457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854868.08465: stderr chunk (state=3): >>>debug2: match found <<< 11389 1726854868.08473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.08539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.08551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.08571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.08659: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11389 1726854868.10594: stdout chunk (state=3): >>>ansible-tmp-1726854868.0768573-12415-180957692233215=/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215 <<< 11389 1726854868.10965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.10996: stdout chunk (state=3): >>><<< 11389 1726854868.10999: stderr chunk (state=3): >>><<< 11389 1726854868.11002: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854868.0768573-12415-180957692233215=/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.11033: variable 'ansible_module_compression' from source: unknown 11389 1726854868.11092: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854868.11397: variable 'ansible_facts' from source: unknown 11389 1726854868.11400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py 11389 1726854868.11789: Sending initial data 11389 1726854868.11792: Sent initial data (156 bytes) 11389 1726854868.13161: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.13168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854868.13406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.13411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.13617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.15246: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11389 1726854868.15253: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 <<< 11389 1726854868.15260: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11389 1726854868.15267: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11389 1726854868.15279: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11389 1726854868.15292: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854868.15363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854868.15435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmptrqst4sf /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py <<< 11389 1726854868.15446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py" <<< 11389 1726854868.15514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11389 1726854868.15595: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmptrqst4sf" to remote "/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py" <<< 11389 1726854868.17010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.17115: stderr chunk (state=3): >>><<< 11389 1726854868.17118: stdout chunk (state=3): >>><<< 11389 1726854868.17143: done transferring module to remote 11389 1726854868.17155: _low_level_execute_command(): starting 11389 1726854868.17158: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/ /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py && sleep 0' 11389 1726854868.18398: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.18404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.18486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.18629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.18633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.18716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.20530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.20593: stderr chunk (state=3): >>><<< 11389 1726854868.20596: stdout chunk (state=3): >>><<< 11389 1726854868.20606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.20609: _low_level_execute_command(): starting 11389 1726854868.20636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/AnsiballZ_command.py && sleep 0' 11389 1726854868.21285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.21303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.21315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.21392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.21395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854868.21398: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854868.21400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.21402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854868.21412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854868.21415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11389 1726854868.21423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.21467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.21533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.21562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.21589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.21686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.37173: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:54:28.367323", "end": "2024-09-20 13:54:28.370777", "delta": "0:00:00.003454", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854868.38801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854868.38805: stdout chunk (state=3): >>><<< 11389 1726854868.38807: stderr chunk (state=3): >>><<< 11389 1726854868.38829: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:54:28.367323", "end": "2024-09-20 13:54:28.370777", "delta": "0:00:00.003454", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854868.38880: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854868.38913: _low_level_execute_command(): starting 11389 1726854868.38993: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854868.0768573-12415-180957692233215/ > /dev/null 2>&1 && sleep 0' 11389 1726854868.39652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.39678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.39715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854868.39741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.39782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.39859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.39893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.39933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.40005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.41891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.41894: stdout chunk (state=3): >>><<< 11389 1726854868.41896: stderr chunk (state=3): >>><<< 11389 1726854868.41993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.41996: handler run complete 11389 1726854868.41999: Evaluated conditional (False): False 11389 1726854868.42103: variable 'result' from source: unknown 11389 1726854868.42136: Evaluated conditional ('110' in result.stdout): True 11389 1726854868.42151: attempt loop complete, returning result 11389 1726854868.42158: _execute() done 11389 1726854868.42164: dumping result to json 11389 1726854868.42172: done dumping result, returning 11389 1726854868.42183: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [0affcc66-ac2b-deb8-c119-000000000071] 11389 1726854868.42194: sending task result for task 0affcc66-ac2b-deb8-c119-000000000071 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003454", "end": "2024-09-20 13:54:28.370777", "rc": 0, "start": "2024-09-20 13:54:28.367323" } STDOUT: MII Polling Interval (ms): 110 11389 1726854868.42465: no more pending results, returning what we have 11389 1726854868.42469: results queue empty 11389 1726854868.42469: checking for any_errors_fatal 11389 1726854868.42477: done checking for any_errors_fatal 11389 1726854868.42478: checking for max_fail_percentage 11389 1726854868.42480: done checking for max_fail_percentage 11389 1726854868.42480: checking to see if all hosts have failed and the running result is not ok 11389 1726854868.42481: done checking to see if all hosts have failed 11389 1726854868.42482: getting the remaining hosts for this loop 11389 1726854868.42483: done getting the remaining hosts for this loop 11389 1726854868.42790: getting the next task for host managed_node3 11389 1726854868.42798: done getting next task for host managed_node3 11389 1726854868.42800: ^ task is: TASK: ** TEST check IPv4 11389 
1726854868.42802: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854868.42806: getting variables 11389 1726854868.42808: in VariableManager get_vars() 11389 1726854868.42848: Calling all_inventory to load vars for managed_node3 11389 1726854868.42850: Calling groups_inventory to load vars for managed_node3 11389 1726854868.42853: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854868.43003: Calling all_plugins_play to load vars for managed_node3 11389 1726854868.43007: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854868.43010: Calling groups_plugins_play to load vars for managed_node3 11389 1726854868.43617: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000071 11389 1726854868.43621: WORKER PROCESS EXITING 11389 1726854868.44581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854868.46057: done with get_vars() 11389 1726854868.46086: done getting variables 11389 1726854868.46159: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 13:54:28 -0400 (0:00:00.441) 0:00:20.884 ****** 11389 1726854868.46193: entering _queue_task() for managed_node3/command 11389 1726854868.46548: worker is 1 (out of 1 
available) 11389 1726854868.46561: exiting _queue_task() for managed_node3/command 11389 1726854868.46572: done queuing things up, now waiting for results queue to drain 11389 1726854868.46578: waiting for pending results... 11389 1726854868.46868: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 11389 1726854868.46984: in run() - task 0affcc66-ac2b-deb8-c119-000000000072 11389 1726854868.47011: variable 'ansible_search_path' from source: unknown 11389 1726854868.47060: calling self._execute() 11389 1726854868.47177: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.47194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.47211: variable 'omit' from source: magic vars 11389 1726854868.47604: variable 'ansible_distribution_major_version' from source: facts 11389 1726854868.47620: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854868.47630: variable 'omit' from source: magic vars 11389 1726854868.47653: variable 'omit' from source: magic vars 11389 1726854868.47754: variable 'controller_device' from source: play vars 11389 1726854868.47784: variable 'omit' from source: magic vars 11389 1726854868.47894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854868.47898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854868.47900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854868.47917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.47933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.47964: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11389 1726854868.47972: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.47978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.48084: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854868.48100: Set connection var ansible_timeout to 10 11389 1726854868.48113: Set connection var ansible_connection to ssh 11389 1726854868.48122: Set connection var ansible_shell_type to sh 11389 1726854868.48131: Set connection var ansible_pipelining to False 11389 1726854868.48139: Set connection var ansible_shell_executable to /bin/sh 11389 1726854868.48162: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.48219: variable 'ansible_connection' from source: unknown 11389 1726854868.48223: variable 'ansible_module_compression' from source: unknown 11389 1726854868.48225: variable 'ansible_shell_type' from source: unknown 11389 1726854868.48227: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.48229: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.48231: variable 'ansible_pipelining' from source: unknown 11389 1726854868.48233: variable 'ansible_timeout' from source: unknown 11389 1726854868.48235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.48346: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854868.48364: variable 'omit' from source: magic vars 11389 1726854868.48373: starting attempt loop 11389 1726854868.48379: running the handler 11389 1726854868.48399: _low_level_execute_command(): starting 11389 1726854868.48411: _low_level_execute_command(): executing: /bin/sh 
-c 'echo ~ && sleep 0' 11389 1726854868.49145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.49163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.49180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.49293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.49323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.49422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.51100: stdout chunk (state=3): >>>/root <<< 11389 1726854868.51235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.51270: stdout chunk (state=3): >>><<< 11389 1726854868.51273: stderr chunk (state=3): >>><<< 11389 1726854868.51384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.51390: _low_level_execute_command(): starting 11389 1726854868.51393: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331 `" && echo ansible-tmp-1726854868.5129588-12441-130603548779331="` echo /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331 `" ) && sleep 0' 11389 1726854868.51958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.51972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.51986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.52006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.52038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854868.52134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854868.52153: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.52176: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.52277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.54279: stdout chunk (state=3): >>>ansible-tmp-1726854868.5129588-12441-130603548779331=/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331 <<< 11389 1726854868.54428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.54438: stdout chunk (state=3): >>><<< 11389 1726854868.54448: stderr chunk (state=3): >>><<< 11389 1726854868.54470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854868.5129588-12441-130603548779331=/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.54514: variable 'ansible_module_compression' from source: unknown 11389 1726854868.54569: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854868.54694: variable 'ansible_facts' from source: unknown 11389 1726854868.54700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py 11389 1726854868.54931: Sending initial data 11389 1726854868.54940: Sent initial data (156 bytes) 11389 1726854868.55433: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.55447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.55463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.55481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.55581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.55598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.55697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.57257: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11389 1726854868.57304: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854868.57366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854868.57427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp8r7v5_vd /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py <<< 11389 1726854868.57431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py" <<< 11389 1726854868.57491: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp8r7v5_vd" to remote "/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py" <<< 11389 1726854868.58356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.58498: stderr chunk (state=3): >>><<< 11389 1726854868.58501: stdout chunk (state=3): >>><<< 11389 1726854868.58503: done transferring module to remote 11389 1726854868.58505: _low_level_execute_command(): starting 11389 1726854868.58508: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/ /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py && sleep 0' 11389 1726854868.59086: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.59107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.59145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.59243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.59271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.59363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.61168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.61291: stdout chunk (state=3): >>><<< 11389 1726854868.61295: stderr chunk (state=3): >>><<< 11389 1726854868.61299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.61301: _low_level_execute_command(): starting 11389 1726854868.61304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/AnsiballZ_command.py && sleep 0' 11389 1726854868.61827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854868.61841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.61858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.61877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854868.61975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.62000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.62018: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.62127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.77435: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.23/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:54:28.769927", "end": "2024-09-20 13:54:28.773510", "delta": "0:00:00.003583", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854868.78928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854868.78954: stderr chunk (state=3): >>><<< 11389 1726854868.78958: stdout chunk (state=3): >>><<< 11389 1726854868.78973: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.23/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:54:28.769927", "end": "2024-09-20 13:54:28.773510", "delta": "0:00:00.003583", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854868.79004: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854868.79011: _low_level_execute_command(): starting 11389 1726854868.79016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854868.5129588-12441-130603548779331/ > /dev/null 2>&1 && sleep 0' 11389 1726854868.79736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.79745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.79748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.79812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.81896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.81901: stdout chunk (state=3): >>><<< 11389 1726854868.81904: stderr chunk (state=3): >>><<< 11389 1726854868.81908: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.81911: handler run complete 11389 1726854868.81913: 
Evaluated conditional (False): False 11389 1726854868.81928: variable 'result' from source: set_fact 11389 1726854868.81945: Evaluated conditional ('192.0.2' in result.stdout): True 11389 1726854868.81958: attempt loop complete, returning result 11389 1726854868.81961: _execute() done 11389 1726854868.81963: dumping result to json 11389 1726854868.81973: done dumping result, returning 11389 1726854868.81980: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0affcc66-ac2b-deb8-c119-000000000072] 11389 1726854868.81986: sending task result for task 0affcc66-ac2b-deb8-c119-000000000072 11389 1726854868.82278: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000072 11389 1726854868.82282: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003583", "end": "2024-09-20 13:54:28.773510", "rc": 0, "start": "2024-09-20 13:54:28.769927" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.23/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 11389 1726854868.82377: no more pending results, returning what we have 11389 1726854868.82380: results queue empty 11389 1726854868.82381: checking for any_errors_fatal 11389 1726854868.82560: done checking for any_errors_fatal 11389 1726854868.82562: checking for max_fail_percentage 11389 1726854868.82564: done checking for max_fail_percentage 11389 1726854868.82567: checking to see if all hosts have failed and the running result is not ok 11389 1726854868.82569: done checking to see if all hosts have failed 11389 1726854868.82570: getting the remaining hosts for this loop 11389 1726854868.82571: done getting the remaining hosts for this loop 11389 1726854868.82575: getting the next task for host managed_node3 11389 1726854868.82582: done getting next task for host managed_node3 11389 1726854868.82586: ^ task is: TASK: ** 
TEST check IPv6 11389 1726854868.82590: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854868.82596: getting variables 11389 1726854868.82597: in VariableManager get_vars() 11389 1726854868.82642: Calling all_inventory to load vars for managed_node3 11389 1726854868.82646: Calling groups_inventory to load vars for managed_node3 11389 1726854868.82649: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854868.82904: Calling all_plugins_play to load vars for managed_node3 11389 1726854868.82909: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854868.82913: Calling groups_plugins_play to load vars for managed_node3 11389 1726854868.86503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854868.89075: done with get_vars() 11389 1726854868.89110: done getting variables 11389 1726854868.89229: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Friday 20 September 2024 13:54:28 -0400 (0:00:00.430) 0:00:21.315 ****** 11389 1726854868.89301: entering _queue_task() for managed_node3/command 11389 1726854868.89770: worker is 1 (out of 1 available) 11389 1726854868.89783: exiting _queue_task() for managed_node3/command 11389 1726854868.89845: done queuing things 
up, now waiting for results queue to drain 11389 1726854868.89847: waiting for pending results... 11389 1726854868.90084: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 11389 1726854868.90177: in run() - task 0affcc66-ac2b-deb8-c119-000000000073 11389 1726854868.90196: variable 'ansible_search_path' from source: unknown 11389 1726854868.90241: calling self._execute() 11389 1726854868.90394: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.90399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.90402: variable 'omit' from source: magic vars 11389 1726854868.90795: variable 'ansible_distribution_major_version' from source: facts 11389 1726854868.90807: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854868.90813: variable 'omit' from source: magic vars 11389 1726854868.90840: variable 'omit' from source: magic vars 11389 1726854868.91092: variable 'controller_device' from source: play vars 11389 1726854868.91096: variable 'omit' from source: magic vars 11389 1726854868.91099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854868.91101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854868.91104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854868.91106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.91119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854868.91156: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854868.91159: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.91161: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.91283: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854868.91290: Set connection var ansible_timeout to 10 11389 1726854868.91293: Set connection var ansible_connection to ssh 11389 1726854868.91298: Set connection var ansible_shell_type to sh 11389 1726854868.91304: Set connection var ansible_pipelining to False 11389 1726854868.91309: Set connection var ansible_shell_executable to /bin/sh 11389 1726854868.91337: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.91340: variable 'ansible_connection' from source: unknown 11389 1726854868.91343: variable 'ansible_module_compression' from source: unknown 11389 1726854868.91345: variable 'ansible_shell_type' from source: unknown 11389 1726854868.91348: variable 'ansible_shell_executable' from source: unknown 11389 1726854868.91350: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854868.91352: variable 'ansible_pipelining' from source: unknown 11389 1726854868.91355: variable 'ansible_timeout' from source: unknown 11389 1726854868.91365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854868.91633: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854868.91644: variable 'omit' from source: magic vars 11389 1726854868.91650: starting attempt loop 11389 1726854868.91653: running the handler 11389 1726854868.91803: _low_level_execute_command(): starting 11389 1726854868.91814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854868.93195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.93200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854868.93204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.93421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854868.93439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.93600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.95294: stdout chunk (state=3): >>>/root <<< 11389 1726854868.95379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.95424: stderr chunk (state=3): >>><<< 11389 1726854868.95427: stdout chunk (state=3): >>><<< 11389 1726854868.95617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.95631: _low_level_execute_command(): starting 11389 1726854868.95637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883 `" && echo ansible-tmp-1726854868.9561756-12456-64875727549883="` echo /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883 `" ) && sleep 0' 11389 1726854868.96994: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.96999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.97007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854868.97017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854868.97070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854868.97079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854868.97202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854868.99135: stdout chunk (state=3): >>>ansible-tmp-1726854868.9561756-12456-64875727549883=/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883 <<< 11389 1726854868.99406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854868.99411: stdout chunk (state=3): >>><<< 11389 1726854868.99416: stderr chunk (state=3): >>><<< 11389 1726854868.99438: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854868.9561756-12456-64875727549883=/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854868.99476: variable 'ansible_module_compression' from source: unknown 11389 1726854868.99528: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854868.99567: variable 'ansible_facts' from source: unknown 11389 1726854868.99983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py 11389 1726854869.00753: Sending initial data 11389 1726854869.00757: Sent initial data (155 bytes) 11389 1726854869.01271: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854869.01285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854869.01304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854869.01409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.01424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854869.01440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.01454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.01785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.03467: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854869.03528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854869.03586: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp6gkk4wcp /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py <<< 11389 1726854869.03592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py" <<< 11389 1726854869.03667: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp6gkk4wcp" to remote "/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py" <<< 11389 1726854869.05220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.05245: stderr chunk (state=3): >>><<< 11389 1726854869.05249: stdout chunk (state=3): >>><<< 11389 1726854869.05309: done transferring module to remote 11389 1726854869.05320: _low_level_execute_command(): starting 11389 1726854869.05325: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/ /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py && sleep 0' 11389 1726854869.06786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.06794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.06833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854869.06837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.06916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.09084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.09294: stdout chunk (state=3): >>><<< 11389 1726854869.09298: stderr chunk (state=3): >>><<< 11389 1726854869.09300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854869.09303: _low_level_execute_command(): starting 11389 1726854869.09305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/AnsiballZ_command.py && sleep 0' 11389 1726854869.10416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854869.10621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854869.10626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11389 1726854869.10629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854869.10632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.10688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854869.10720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.10812: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.10815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.26910: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::22/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::f060:7ff:fea9:bf2b/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::f060:7ff:fea9:bf2b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:54:29.264403", "end": "2024-09-20 13:54:29.268127", "delta": "0:00:00.003724", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854869.28681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854869.28685: stdout chunk (state=3): >>><<< 11389 1726854869.28691: stderr chunk (state=3): >>><<< 11389 1726854869.28718: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::22/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::f060:7ff:fea9:bf2b/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::f060:7ff:fea9:bf2b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:54:29.264403", "end": "2024-09-20 13:54:29.268127", "delta": "0:00:00.003724", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854869.28756: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854869.28764: _low_level_execute_command(): starting 11389 1726854869.28769: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854868.9561756-12456-64875727549883/ > /dev/null 2>&1 && sleep 0' 11389 1726854869.29363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854869.29373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854869.29381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854869.29392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854869.29498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.29506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.29596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.31593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.31596: stderr chunk (state=3): >>><<< 11389 1726854869.31599: stdout chunk (state=3): >>><<< 11389 1726854869.31602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854869.31604: handler run complete 11389 1726854869.31606: Evaluated conditional (False): False 11389 1726854869.31780: variable 'result' from source: set_fact 11389 1726854869.31801: Evaluated conditional ('2001' in result.stdout): True 11389 1726854869.31813: attempt loop complete, returning result 11389 1726854869.31816: _execute() done 11389 1726854869.31818: dumping result to json 11389 1726854869.31824: done dumping result, returning 11389 1726854869.31833: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0affcc66-ac2b-deb8-c119-000000000073] 11389 1726854869.31838: sending task result for task 0affcc66-ac2b-deb8-c119-000000000073 11389 1726854869.31944: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000073 11389 1726854869.31947: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003724", "end": "2024-09-20 13:54:29.268127", "rc": 0, "start": "2024-09-20 13:54:29.264403" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::22/128 scope global dynamic noprefixroute valid_lft 237sec preferred_lft 237sec inet6 2001:db8::f060:7ff:fea9:bf2b/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::f060:7ff:fea9:bf2b/64 scope link noprefixroute valid_lft forever preferred_lft forever 11389 1726854869.32084: no more pending results, returning what we have 11389 1726854869.32092: results queue empty 11389 1726854869.32093: checking for any_errors_fatal 11389 1726854869.32101: done checking for any_errors_fatal 11389 
1726854869.32102: checking for max_fail_percentage 11389 1726854869.32106: done checking for max_fail_percentage 11389 1726854869.32106: checking to see if all hosts have failed and the running result is not ok 11389 1726854869.32108: done checking to see if all hosts have failed 11389 1726854869.32108: getting the remaining hosts for this loop 11389 1726854869.32110: done getting the remaining hosts for this loop 11389 1726854869.32113: getting the next task for host managed_node3 11389 1726854869.32125: done getting next task for host managed_node3 11389 1726854869.32131: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11389 1726854869.32136: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854869.32155: getting variables 11389 1726854869.32157: in VariableManager get_vars() 11389 1726854869.32321: Calling all_inventory to load vars for managed_node3 11389 1726854869.32324: Calling groups_inventory to load vars for managed_node3 11389 1726854869.32327: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.32338: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.32341: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.32344: Calling groups_plugins_play to load vars for managed_node3 11389 1726854869.33938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.35849: done with get_vars() 11389 1726854869.35883: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:54:29 -0400 (0:00:00.467) 0:00:21.783 ****** 11389 1726854869.36022: entering _queue_task() for managed_node3/include_tasks 11389 1726854869.36465: worker is 1 (out of 1 available) 11389 1726854869.36481: exiting _queue_task() for managed_node3/include_tasks 11389 1726854869.36496: done queuing things up, now waiting for results queue to drain 11389 1726854869.36498: waiting for pending results... 
11389 1726854869.36902: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11389 1726854869.37544: in run() - task 0affcc66-ac2b-deb8-c119-00000000007c 11389 1726854869.37549: variable 'ansible_search_path' from source: unknown 11389 1726854869.37552: variable 'ansible_search_path' from source: unknown 11389 1726854869.37555: calling self._execute() 11389 1726854869.37558: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.37561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.37564: variable 'omit' from source: magic vars 11389 1726854869.38208: variable 'ansible_distribution_major_version' from source: facts 11389 1726854869.38308: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854869.38322: _execute() done 11389 1726854869.38331: dumping result to json 11389 1726854869.38339: done dumping result, returning 11389 1726854869.38350: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-deb8-c119-00000000007c] 11389 1726854869.38794: sending task result for task 0affcc66-ac2b-deb8-c119-00000000007c 11389 1726854869.38867: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000007c 11389 1726854869.38871: WORKER PROCESS EXITING 11389 1726854869.38914: no more pending results, returning what we have 11389 1726854869.38921: in VariableManager get_vars() 11389 1726854869.38960: Calling all_inventory to load vars for managed_node3 11389 1726854869.38963: Calling groups_inventory to load vars for managed_node3 11389 1726854869.38965: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.38976: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.38979: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.39094: Calling 
groups_plugins_play to load vars for managed_node3 11389 1726854869.42186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.43891: done with get_vars() 11389 1726854869.43923: variable 'ansible_search_path' from source: unknown 11389 1726854869.43924: variable 'ansible_search_path' from source: unknown 11389 1726854869.43972: we have included files to process 11389 1726854869.43974: generating all_blocks data 11389 1726854869.43976: done generating all_blocks data 11389 1726854869.43981: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854869.43982: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854869.43985: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11389 1726854869.44646: done processing included file 11389 1726854869.44649: iterating over new_blocks loaded from include file 11389 1726854869.44651: in VariableManager get_vars() 11389 1726854869.44681: done with get_vars() 11389 1726854869.44683: filtering new block on tags 11389 1726854869.44718: done filtering new block on tags 11389 1726854869.44721: in VariableManager get_vars() 11389 1726854869.44746: done with get_vars() 11389 1726854869.44748: filtering new block on tags 11389 1726854869.44792: done filtering new block on tags 11389 1726854869.44794: in VariableManager get_vars() 11389 1726854869.44818: done with get_vars() 11389 1726854869.44820: filtering new block on tags 11389 1726854869.44859: done filtering new block on tags 11389 1726854869.44861: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11389 1726854869.44866: extending task lists for 
all hosts with included blocks 11389 1726854869.45941: done extending task lists 11389 1726854869.45943: done processing included files 11389 1726854869.45943: results queue empty 11389 1726854869.45944: checking for any_errors_fatal 11389 1726854869.45948: done checking for any_errors_fatal 11389 1726854869.45949: checking for max_fail_percentage 11389 1726854869.45950: done checking for max_fail_percentage 11389 1726854869.45951: checking to see if all hosts have failed and the running result is not ok 11389 1726854869.45952: done checking to see if all hosts have failed 11389 1726854869.45953: getting the remaining hosts for this loop 11389 1726854869.45954: done getting the remaining hosts for this loop 11389 1726854869.45956: getting the next task for host managed_node3 11389 1726854869.45961: done getting next task for host managed_node3 11389 1726854869.45963: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11389 1726854869.45967: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854869.45977: getting variables 11389 1726854869.45978: in VariableManager get_vars() 11389 1726854869.45996: Calling all_inventory to load vars for managed_node3 11389 1726854869.45998: Calling groups_inventory to load vars for managed_node3 11389 1726854869.46000: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.46005: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.46007: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.46009: Calling groups_plugins_play to load vars for managed_node3 11389 1726854869.47098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.48631: done with get_vars() 11389 1726854869.48655: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:54:29 -0400 (0:00:00.127) 0:00:21.910 ****** 11389 1726854869.48736: entering _queue_task() for managed_node3/setup 11389 1726854869.49091: worker is 1 (out of 1 available) 11389 1726854869.49106: exiting _queue_task() for managed_node3/setup 11389 1726854869.49119: done queuing things up, now waiting for results queue to drain 11389 1726854869.49121: waiting for pending results... 
11389 1726854869.49377: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11389 1726854869.49530: in run() - task 0affcc66-ac2b-deb8-c119-000000000491 11389 1726854869.49545: variable 'ansible_search_path' from source: unknown 11389 1726854869.49549: variable 'ansible_search_path' from source: unknown 11389 1726854869.49586: calling self._execute() 11389 1726854869.49683: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.49689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.49700: variable 'omit' from source: magic vars 11389 1726854869.50067: variable 'ansible_distribution_major_version' from source: facts 11389 1726854869.50081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854869.50299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854869.52644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854869.52734: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854869.52792: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854869.52835: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854869.52877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854869.52968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854869.53185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854869.53190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854869.53527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854869.53530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854869.53533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854869.53538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854869.53569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854869.53618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854869.53752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854869.53931: variable '__network_required_facts' from source: role 
'' defaults 11389 1726854869.53946: variable 'ansible_facts' from source: unknown 11389 1726854869.54733: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11389 1726854869.54743: when evaluation is False, skipping this task 11389 1726854869.54750: _execute() done 11389 1726854869.54757: dumping result to json 11389 1726854869.54764: done dumping result, returning 11389 1726854869.54776: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcc66-ac2b-deb8-c119-000000000491] 11389 1726854869.54786: sending task result for task 0affcc66-ac2b-deb8-c119-000000000491 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854869.54982: no more pending results, returning what we have 11389 1726854869.54986: results queue empty 11389 1726854869.54989: checking for any_errors_fatal 11389 1726854869.54991: done checking for any_errors_fatal 11389 1726854869.54991: checking for max_fail_percentage 11389 1726854869.54993: done checking for max_fail_percentage 11389 1726854869.54994: checking to see if all hosts have failed and the running result is not ok 11389 1726854869.54995: done checking to see if all hosts have failed 11389 1726854869.54996: getting the remaining hosts for this loop 11389 1726854869.54997: done getting the remaining hosts for this loop 11389 1726854869.55001: getting the next task for host managed_node3 11389 1726854869.55011: done getting next task for host managed_node3 11389 1726854869.55015: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11389 1726854869.55021: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854869.55038: getting variables 11389 1726854869.55040: in VariableManager get_vars() 11389 1726854869.55191: Calling all_inventory to load vars for managed_node3 11389 1726854869.55195: Calling groups_inventory to load vars for managed_node3 11389 1726854869.55203: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.55210: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000491 11389 1726854869.55213: WORKER PROCESS EXITING 11389 1726854869.55224: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.55228: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.55232: Calling groups_plugins_play to load vars for managed_node3 11389 1726854869.56845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.58442: done with get_vars() 11389 1726854869.58474: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:54:29 -0400 (0:00:00.098) 0:00:22.008 ****** 11389 1726854869.58596: entering _queue_task() for managed_node3/stat 11389 1726854869.58941: worker is 1 (out of 1 available) 11389 1726854869.58954: exiting _queue_task() for managed_node3/stat 11389 1726854869.58964: done queuing things up, now waiting for results queue to drain 11389 1726854869.58968: waiting for pending results... 11389 1726854869.59263: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11389 1726854869.59448: in run() - task 0affcc66-ac2b-deb8-c119-000000000493 11389 1726854869.59475: variable 'ansible_search_path' from source: unknown 11389 1726854869.59484: variable 'ansible_search_path' from source: unknown 11389 1726854869.59532: calling self._execute() 11389 1726854869.59635: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.59646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.59660: variable 'omit' from source: magic vars 11389 1726854869.60040: variable 'ansible_distribution_major_version' from source: facts 11389 1726854869.60056: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854869.60287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854869.60536: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854869.60584: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854869.60631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854869.60671: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
11389 1726854869.60771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854869.60804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854869.60837: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854869.60936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854869.60975: variable '__network_is_ostree' from source: set_fact 11389 1726854869.60988: Evaluated conditional (not __network_is_ostree is defined): False 11389 1726854869.60996: when evaluation is False, skipping this task 11389 1726854869.61004: _execute() done 11389 1726854869.61010: dumping result to json 11389 1726854869.61018: done dumping result, returning 11389 1726854869.61030: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcc66-ac2b-deb8-c119-000000000493] 11389 1726854869.61044: sending task result for task 0affcc66-ac2b-deb8-c119-000000000493 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11389 1726854869.61212: no more pending results, returning what we have 11389 1726854869.61217: results queue empty 11389 1726854869.61217: checking for any_errors_fatal 11389 1726854869.61225: done checking for any_errors_fatal 11389 1726854869.61226: checking for max_fail_percentage 11389 1726854869.61228: done checking for 
max_fail_percentage 11389 1726854869.61228: checking to see if all hosts have failed and the running result is not ok 11389 1726854869.61230: done checking to see if all hosts have failed 11389 1726854869.61230: getting the remaining hosts for this loop 11389 1726854869.61232: done getting the remaining hosts for this loop 11389 1726854869.61235: getting the next task for host managed_node3 11389 1726854869.61243: done getting next task for host managed_node3 11389 1726854869.61247: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11389 1726854869.61252: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854869.61274: getting variables 11389 1726854869.61276: in VariableManager get_vars() 11389 1726854869.61322: Calling all_inventory to load vars for managed_node3 11389 1726854869.61325: Calling groups_inventory to load vars for managed_node3 11389 1726854869.61328: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.61339: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.61343: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.61346: Calling groups_plugins_play to load vars for managed_node3 11389 1726854869.62394: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000493 11389 1726854869.62397: WORKER PROCESS EXITING 11389 1726854869.64801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.67785: done with get_vars() 11389 1726854869.68021: done getting variables 11389 1726854869.68090: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:54:29 -0400 (0:00:00.095) 0:00:22.104 ****** 11389 1726854869.68132: entering _queue_task() for managed_node3/set_fact 11389 1726854869.68880: worker is 1 (out of 1 available) 11389 1726854869.68897: exiting _queue_task() for managed_node3/set_fact 11389 1726854869.68910: done queuing things up, now waiting for results queue to drain 11389 1726854869.68911: waiting for pending results... 
11389 1726854869.69298: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11389 1726854869.69472: in run() - task 0affcc66-ac2b-deb8-c119-000000000494 11389 1726854869.69502: variable 'ansible_search_path' from source: unknown 11389 1726854869.69510: variable 'ansible_search_path' from source: unknown 11389 1726854869.69550: calling self._execute() 11389 1726854869.69656: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.69671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.69689: variable 'omit' from source: magic vars 11389 1726854869.70085: variable 'ansible_distribution_major_version' from source: facts 11389 1726854869.70105: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854869.70291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854869.70577: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854869.70631: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854869.70675: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854869.70719: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854869.70815: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854869.70846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854869.70881: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854869.70993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854869.71012: variable '__network_is_ostree' from source: set_fact 11389 1726854869.71030: Evaluated conditional (not __network_is_ostree is defined): False 11389 1726854869.71037: when evaluation is False, skipping this task 11389 1726854869.71044: _execute() done 11389 1726854869.71050: dumping result to json 11389 1726854869.71060: done dumping result, returning 11389 1726854869.71075: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcc66-ac2b-deb8-c119-000000000494] 11389 1726854869.71086: sending task result for task 0affcc66-ac2b-deb8-c119-000000000494 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11389 1726854869.71337: no more pending results, returning what we have 11389 1726854869.71341: results queue empty 11389 1726854869.71342: checking for any_errors_fatal 11389 1726854869.71351: done checking for any_errors_fatal 11389 1726854869.71352: checking for max_fail_percentage 11389 1726854869.71354: done checking for max_fail_percentage 11389 1726854869.71354: checking to see if all hosts have failed and the running result is not ok 11389 1726854869.71356: done checking to see if all hosts have failed 11389 1726854869.71356: getting the remaining hosts for this loop 11389 1726854869.71358: done getting the remaining hosts for this loop 11389 1726854869.71362: getting the next task for host managed_node3 11389 1726854869.71376: done getting next task for host managed_node3 11389 
1726854869.71380: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11389 1726854869.71386: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854869.71406: getting variables 11389 1726854869.71408: in VariableManager get_vars() 11389 1726854869.71452: Calling all_inventory to load vars for managed_node3 11389 1726854869.71456: Calling groups_inventory to load vars for managed_node3 11389 1726854869.71459: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854869.71472: Calling all_plugins_play to load vars for managed_node3 11389 1726854869.71476: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854869.71480: Calling groups_plugins_play to load vars for managed_node3 11389 1726854869.71493: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000494 11389 1726854869.71497: WORKER PROCESS EXITING 11389 1726854869.73096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854869.74700: done with get_vars() 11389 1726854869.74728: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:54:29 -0400 (0:00:00.067) 0:00:22.171 ****** 11389 1726854869.74835: entering _queue_task() for managed_node3/service_facts 11389 1726854869.75183: worker is 1 (out of 1 available) 11389 1726854869.75397: exiting _queue_task() for managed_node3/service_facts 11389 1726854869.75407: done queuing things up, now waiting for results queue to drain 11389 1726854869.75409: waiting for pending results... 
11389 1726854869.75506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11389 1726854869.75693: in run() - task 0affcc66-ac2b-deb8-c119-000000000496 11389 1726854869.75716: variable 'ansible_search_path' from source: unknown 11389 1726854869.75723: variable 'ansible_search_path' from source: unknown 11389 1726854869.75769: calling self._execute() 11389 1726854869.75882: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.75898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.75914: variable 'omit' from source: magic vars 11389 1726854869.76296: variable 'ansible_distribution_major_version' from source: facts 11389 1726854869.76311: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854869.76320: variable 'omit' from source: magic vars 11389 1726854869.76405: variable 'omit' from source: magic vars 11389 1726854869.76443: variable 'omit' from source: magic vars 11389 1726854869.76489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854869.76534: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854869.76557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854869.76581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854869.76599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854869.76637: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854869.76645: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.76652: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11389 1726854869.76835: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854869.76838: Set connection var ansible_timeout to 10 11389 1726854869.76840: Set connection var ansible_connection to ssh 11389 1726854869.76842: Set connection var ansible_shell_type to sh 11389 1726854869.76844: Set connection var ansible_pipelining to False 11389 1726854869.76846: Set connection var ansible_shell_executable to /bin/sh 11389 1726854869.76848: variable 'ansible_shell_executable' from source: unknown 11389 1726854869.76849: variable 'ansible_connection' from source: unknown 11389 1726854869.76851: variable 'ansible_module_compression' from source: unknown 11389 1726854869.76853: variable 'ansible_shell_type' from source: unknown 11389 1726854869.76855: variable 'ansible_shell_executable' from source: unknown 11389 1726854869.76856: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854869.76858: variable 'ansible_pipelining' from source: unknown 11389 1726854869.76860: variable 'ansible_timeout' from source: unknown 11389 1726854869.76861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854869.77051: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854869.77074: variable 'omit' from source: magic vars 11389 1726854869.77084: starting attempt loop 11389 1726854869.77093: running the handler 11389 1726854869.77109: _low_level_execute_command(): starting 11389 1726854869.77119: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854869.77907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.77981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.78158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.78249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.79963: stdout chunk (state=3): >>>/root <<< 11389 1726854869.80400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.80403: stdout chunk (state=3): >>><<< 11389 1726854869.80406: stderr chunk (state=3): >>><<< 11389 1726854869.80409: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854869.80412: _low_level_execute_command(): starting 11389 1726854869.80414: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782 `" && echo ansible-tmp-1726854869.8030875-12489-15581394558782="` echo /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782 `" ) && sleep 0' 11389 1726854869.81806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.81898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854869.81955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.82067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.84079: stdout chunk (state=3): >>>ansible-tmp-1726854869.8030875-12489-15581394558782=/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782 <<< 11389 1726854869.84180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.84258: stderr chunk (state=3): >>><<< 11389 1726854869.84262: stdout chunk (state=3): >>><<< 11389 1726854869.84281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854869.8030875-12489-15581394558782=/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854869.84594: variable 'ansible_module_compression' from source: unknown 11389 1726854869.84598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11389 1726854869.84600: variable 'ansible_facts' from source: unknown 11389 1726854869.84820: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py 11389 1726854869.85120: Sending initial data 11389 1726854869.85131: Sent initial data (161 bytes) 11389 1726854869.86509: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.86616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854869.86627: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854869.86715: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting 
O_NONBLOCK <<< 11389 1726854869.86855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.86912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.88605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854869.88702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854869.88770: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpx4egbb3u /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py <<< 11389 1726854869.88782: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py" <<< 11389 1726854869.88826: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpx4egbb3u" to remote "/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py" <<< 11389 1726854869.88838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py" <<< 11389 1726854869.90450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.90504: stderr chunk (state=3): >>><<< 11389 1726854869.90513: stdout chunk (state=3): >>><<< 11389 1726854869.90791: done transferring module to remote 11389 1726854869.90796: _low_level_execute_command(): starting 11389 1726854869.90798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/ /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py && sleep 0' 11389 1726854869.91839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854869.91843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.91845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854869.91847: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854869.91850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854869.92019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.92310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854869.92432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854869.94494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854869.94499: stdout chunk (state=3): >>><<< 11389 1726854869.94501: stderr chunk (state=3): >>><<< 11389 1726854869.94509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854869.94513: _low_level_execute_command(): starting 11389 1726854869.94516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/AnsiballZ_service_facts.py && sleep 0' 11389 1726854869.95808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854869.95813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 11389 1726854869.95964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.48358: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": 
{"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11389 1726854871.48374: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 11389 1726854871.48380: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": 
{"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, 
"invocation": {"module_args": {}}} <<< 11389 1726854871.49878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854871.49905: stderr chunk (state=3): >>><<< 11389 1726854871.49908: stdout chunk (state=3): >>><<< 11389 1726854871.49933: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
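As an aside, the service map dumped above follows the standard `service_facts` shape (`{unit_name: {name, state, status, source}}`). A minimal sketch of post-processing such a result, using a hand-picked excerpt of the logged data rather than the full capture, to find units that are enabled but not currently running:

```python
# Excerpt of the service_facts output logged above; the dict shape matches
# the module's documented return value, but this is sample data, not the
# complete capture.
services = {
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "getty@.service": {"name": "getty@.service", "state": "unknown",
                       "status": "enabled", "source": "systemd"},
    "qemu-guest-agent.service": {"name": "qemu-guest-agent.service",
                                 "state": "inactive", "status": "enabled",
                                 "source": "systemd"},
}

def enabled_but_not_running(facts):
    """Return sorted names of units that are enabled yet not running."""
    return sorted(
        name for name, svc in facts.items()
        if svc["status"] == "enabled" and svc["state"] != "running"
    )

print(enabled_but_not_running(services))
# ['getty@.service', 'qemu-guest-agent.service']
```

In a playbook the same filter could be expressed with `ansible_facts.services | dict2items | selectattr(...)`; the Python form is shown here only to make the data shape explicit.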
11389 1726854871.50597: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854871.50610: _low_level_execute_command(): starting 11389 1726854871.50613: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854869.8030875-12489-15581394558782/ > /dev/null 2>&1 && sleep 0' 11389 1726854871.51066: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854871.51070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854871.51072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854871.51074: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854871.51076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.51127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.51134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.51136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.51195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.53008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854871.53034: stderr chunk (state=3): >>><<< 11389 1726854871.53037: stdout chunk (state=3): >>><<< 11389 1726854871.53048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854871.53054: handler run complete 11389 1726854871.53165: variable 'ansible_facts' from source: unknown 11389 1726854871.53258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854871.53531: variable 'ansible_facts' from source: unknown 11389 1726854871.53609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854871.53722: attempt loop complete, returning result 11389 1726854871.53725: _execute() done 11389 1726854871.53728: dumping result to json 11389 1726854871.53762: done dumping result, returning 11389 1726854871.53774: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcc66-ac2b-deb8-c119-000000000496] 11389 1726854871.53778: sending task result for task 0affcc66-ac2b-deb8-c119-000000000496 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854871.54600: no more pending results, returning what we have 11389 1726854871.54603: results queue empty 11389 1726854871.54604: checking for any_errors_fatal 11389 1726854871.54606: done checking for any_errors_fatal 11389 1726854871.54607: checking for max_fail_percentage 11389 1726854871.54609: done checking for max_fail_percentage 11389 1726854871.54609: checking to see if all hosts have failed and the running result is not ok 11389 1726854871.54610: done checking to see if all hosts have failed 11389 1726854871.54611: getting the remaining hosts for this loop 11389 1726854871.54612: done getting the remaining hosts for this loop 11389 1726854871.54621: getting the next task for host managed_node3 11389 1726854871.54626: done getting next task for host managed_node3 11389 1726854871.54629: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 11389 1726854871.54635: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854871.54643: getting variables 11389 1726854871.54644: in VariableManager get_vars() 11389 1726854871.54673: Calling all_inventory to load vars for managed_node3 11389 1726854871.54676: Calling groups_inventory to load vars for managed_node3 11389 1726854871.54678: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854871.54686: Calling all_plugins_play to load vars for managed_node3 11389 1726854871.54690: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854871.54693: Calling groups_plugins_play to load vars for managed_node3 11389 1726854871.55239: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000496 11389 1726854871.55242: WORKER PROCESS EXITING 11389 1726854871.56103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854871.57804: done with get_vars() 11389 1726854871.57834: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:54:31 -0400 (0:00:01.831) 0:00:24.002 ****** 11389 1726854871.57944: entering _queue_task() for managed_node3/package_facts 11389 1726854871.58393: worker is 1 (out of 1 available) 11389 1726854871.58405: exiting _queue_task() for managed_node3/package_facts 11389 1726854871.58418: done queuing things up, now waiting for results queue to drain 11389 1726854871.58420: waiting for pending results... 
11389 1726854871.58648: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11389 1726854871.58859: in run() - task 0affcc66-ac2b-deb8-c119-000000000497 11389 1726854871.58864: variable 'ansible_search_path' from source: unknown 11389 1726854871.58869: variable 'ansible_search_path' from source: unknown 11389 1726854871.58900: calling self._execute() 11389 1726854871.59078: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854871.59081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854871.59084: variable 'omit' from source: magic vars 11389 1726854871.59413: variable 'ansible_distribution_major_version' from source: facts 11389 1726854871.59428: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854871.59438: variable 'omit' from source: magic vars 11389 1726854871.59531: variable 'omit' from source: magic vars 11389 1726854871.59573: variable 'omit' from source: magic vars 11389 1726854871.59623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854871.59660: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854871.59688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854871.59710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854871.59732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854871.59763: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854871.59774: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854871.59837: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11389 1726854871.59896: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854871.59909: Set connection var ansible_timeout to 10 11389 1726854871.59916: Set connection var ansible_connection to ssh 11389 1726854871.59924: Set connection var ansible_shell_type to sh 11389 1726854871.59932: Set connection var ansible_pipelining to False 11389 1726854871.59947: Set connection var ansible_shell_executable to /bin/sh 11389 1726854871.59974: variable 'ansible_shell_executable' from source: unknown 11389 1726854871.59982: variable 'ansible_connection' from source: unknown 11389 1726854871.59991: variable 'ansible_module_compression' from source: unknown 11389 1726854871.59998: variable 'ansible_shell_type' from source: unknown 11389 1726854871.60004: variable 'ansible_shell_executable' from source: unknown 11389 1726854871.60010: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854871.60054: variable 'ansible_pipelining' from source: unknown 11389 1726854871.60057: variable 'ansible_timeout' from source: unknown 11389 1726854871.60059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854871.60256: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854871.60281: variable 'omit' from source: magic vars 11389 1726854871.60293: starting attempt loop 11389 1726854871.60301: running the handler 11389 1726854871.60319: _low_level_execute_command(): starting 11389 1726854871.60383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854871.61062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854871.61077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11389 1726854871.61093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854871.61110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854871.61124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854871.61191: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.61244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.61268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.61310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.61411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.63164: stdout chunk (state=3): >>>/root <<< 11389 1726854871.63203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854871.63240: stderr chunk (state=3): >>><<< 11389 1726854871.63258: stdout chunk (state=3): >>><<< 11389 1726854871.63277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854871.63376: _low_level_execute_command(): starting 11389 1726854871.63380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959 `" && echo ansible-tmp-1726854871.6328416-12554-218088607389959="` echo /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959 `" ) && sleep 0' 11389 1726854871.63921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854871.63937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854871.63950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854871.63968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854871.63998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 <<< 11389 1726854871.64010: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854871.64023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.64110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.64132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.64149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.64173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.64273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.66190: stdout chunk (state=3): >>>ansible-tmp-1726854871.6328416-12554-218088607389959=/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959 <<< 11389 1726854871.66370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854871.66374: stdout chunk (state=3): >>><<< 11389 1726854871.66376: stderr chunk (state=3): >>><<< 11389 1726854871.66397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854871.6328416-12554-218088607389959=/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854871.66599: variable 'ansible_module_compression' from source: unknown 11389 1726854871.66602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11389 1726854871.66605: variable 'ansible_facts' from source: unknown 11389 1726854871.66775: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py 11389 1726854871.66940: Sending initial data 11389 1726854871.66949: Sent initial data (162 bytes) 11389 1726854871.67734: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854871.67742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854871.67754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854871.67779: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854871.67793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854871.67913: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.68102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.68142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.68153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.68404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.69896: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11389 1726854871.69910: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11389 1726854871.69919: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11389 1726854871.69929: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11389 1726854871.69939: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11389 1726854871.69953: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854871.70031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854871.70114: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp3vf83uku /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py <<< 11389 1726854871.70123: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py" <<< 11389 1726854871.70162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp3vf83uku" to remote "/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py" <<< 11389 1726854871.72245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854871.72249: stdout chunk (state=3): >>><<< 11389 1726854871.72251: stderr chunk (state=3): >>><<< 11389 1726854871.72255: done transferring module to remote 11389 1726854871.72257: _low_level_execute_command(): starting 11389 1726854871.72259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/ /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py && sleep 0' 11389 1726854871.72792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 11389 1726854871.72994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.72999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.73002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.73032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854871.74869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854871.74879: stdout chunk (state=3): >>><<< 11389 1726854871.74893: stderr chunk (state=3): >>><<< 11389 1726854871.74911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854871.74919: _low_level_execute_command(): starting 11389 1726854871.74927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/AnsiballZ_package_facts.py && sleep 0' 11389 1726854871.76204: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854871.76218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854871.76229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854871.76293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854871.76394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854871.76526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854871.76582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854872.20501: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11389 1726854872.20522: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": 
[{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 11389 1726854872.20586: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": 
"libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 11389 1726854872.20606: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 11389 1726854872.20615: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 11389 1726854872.20618: stdout chunk 
(state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11389 1726854872.20639: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", 
"version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11389 1726854872.20659: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11389 1726854872.20671: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", 
"source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11389 1726854872.22466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854872.22501: stderr chunk (state=3): >>><<< 11389 1726854872.22504: stdout chunk (state=3): >>><<< 11389 1726854872.22539: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854872.23850: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854872.23873: _low_level_execute_command(): starting 11389 1726854872.23890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854871.6328416-12554-218088607389959/ > /dev/null 2>&1 && sleep 0' 11389 1726854872.24352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854872.24355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854872.24357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854872.24359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854872.24361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854872.24420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854872.24428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854872.24431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854872.24490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854872.26354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854872.26379: stderr chunk (state=3): >>><<< 11389 1726854872.26383: stdout chunk (state=3): >>><<< 11389 1726854872.26400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854872.26403: handler run complete 11389 1726854872.26877: variable 'ansible_facts' from source: unknown 11389 1726854872.27190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.28793: variable 'ansible_facts' from source: unknown 11389 1726854872.29004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.29718: attempt loop complete, returning result 11389 1726854872.29736: _execute() done 11389 1726854872.29747: dumping result to json 11389 1726854872.29961: done dumping result, returning 11389 1726854872.29980: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcc66-ac2b-deb8-c119-000000000497] 11389 1726854872.29992: sending task result for task 0affcc66-ac2b-deb8-c119-000000000497 11389 1726854872.32471: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000497 11389 1726854872.32474: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854872.32622: no more pending results, returning what we have 11389 1726854872.32626: results queue empty 11389 1726854872.32627: checking for any_errors_fatal 11389 1726854872.32636: done checking for any_errors_fatal 11389 1726854872.32637: checking for max_fail_percentage 11389 1726854872.32639: done checking for max_fail_percentage 11389 1726854872.32639: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.32640: done checking to see if all hosts have failed 11389 1726854872.32641: getting the remaining hosts for this loop 11389 1726854872.32642: done getting the remaining hosts for this loop 11389 1726854872.32646: getting 
the next task for host managed_node3 11389 1726854872.32652: done getting next task for host managed_node3 11389 1726854872.32657: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11389 1726854872.32661: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854872.32675: getting variables 11389 1726854872.32677: in VariableManager get_vars() 11389 1726854872.32712: Calling all_inventory to load vars for managed_node3 11389 1726854872.32715: Calling groups_inventory to load vars for managed_node3 11389 1726854872.32717: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.32726: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.32729: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.32732: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.34077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.36482: done with get_vars() 11389 1726854872.36715: done getting variables 11389 1726854872.36768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:54:32 -0400 (0:00:00.788) 0:00:24.791 ****** 11389 1726854872.36813: entering _queue_task() for managed_node3/debug 11389 1726854872.37525: worker is 1 (out of 1 available) 11389 1726854872.37538: exiting _queue_task() for managed_node3/debug 11389 1726854872.37549: done queuing things up, now waiting for results queue to drain 11389 1726854872.37551: waiting for pending results... 
11389 1726854872.38003: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11389 1726854872.38394: in run() - task 0affcc66-ac2b-deb8-c119-00000000007d 11389 1726854872.38398: variable 'ansible_search_path' from source: unknown 11389 1726854872.38400: variable 'ansible_search_path' from source: unknown 11389 1726854872.38403: calling self._execute() 11389 1726854872.38405: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.38408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.38411: variable 'omit' from source: magic vars 11389 1726854872.39118: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.39392: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.39396: variable 'omit' from source: magic vars 11389 1726854872.39399: variable 'omit' from source: magic vars 11389 1726854872.39484: variable 'network_provider' from source: set_fact 11389 1726854872.39792: variable 'omit' from source: magic vars 11389 1726854872.39796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854872.39799: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854872.39814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854872.39838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854872.40291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854872.40295: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854872.40297: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 
1726854872.40299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.40301: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854872.40303: Set connection var ansible_timeout to 10 11389 1726854872.40304: Set connection var ansible_connection to ssh 11389 1726854872.40306: Set connection var ansible_shell_type to sh 11389 1726854872.40308: Set connection var ansible_pipelining to False 11389 1726854872.40309: Set connection var ansible_shell_executable to /bin/sh 11389 1726854872.40311: variable 'ansible_shell_executable' from source: unknown 11389 1726854872.40313: variable 'ansible_connection' from source: unknown 11389 1726854872.40319: variable 'ansible_module_compression' from source: unknown 11389 1726854872.40324: variable 'ansible_shell_type' from source: unknown 11389 1726854872.40329: variable 'ansible_shell_executable' from source: unknown 11389 1726854872.40334: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.40340: variable 'ansible_pipelining' from source: unknown 11389 1726854872.40345: variable 'ansible_timeout' from source: unknown 11389 1726854872.40350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.40475: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854872.40708: variable 'omit' from source: magic vars 11389 1726854872.40719: starting attempt loop 11389 1726854872.40725: running the handler 11389 1726854872.40779: handler run complete 11389 1726854872.40803: attempt loop complete, returning result 11389 1726854872.40810: _execute() done 11389 1726854872.40817: dumping result to json 11389 1726854872.40825: done dumping result, returning 
11389 1726854872.40836: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-deb8-c119-00000000007d] 11389 1726854872.40846: sending task result for task 0affcc66-ac2b-deb8-c119-00000000007d 11389 1726854872.41259: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000007d 11389 1726854872.41263: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11389 1726854872.41327: no more pending results, returning what we have 11389 1726854872.41330: results queue empty 11389 1726854872.41331: checking for any_errors_fatal 11389 1726854872.41340: done checking for any_errors_fatal 11389 1726854872.41340: checking for max_fail_percentage 11389 1726854872.41343: done checking for max_fail_percentage 11389 1726854872.41343: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.41344: done checking to see if all hosts have failed 11389 1726854872.41345: getting the remaining hosts for this loop 11389 1726854872.41346: done getting the remaining hosts for this loop 11389 1726854872.41350: getting the next task for host managed_node3 11389 1726854872.41358: done getting next task for host managed_node3 11389 1726854872.41362: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11389 1726854872.41366: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854872.41379: getting variables 11389 1726854872.41380: in VariableManager get_vars() 11389 1726854872.41419: Calling all_inventory to load vars for managed_node3 11389 1726854872.41422: Calling groups_inventory to load vars for managed_node3 11389 1726854872.41424: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.41432: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.41434: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.41437: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.53303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.55498: done with get_vars() 11389 1726854872.55531: done getting variables 11389 1726854872.55579: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:54:32 -0400 (0:00:00.189) 0:00:24.980 ****** 11389 1726854872.55717: entering _queue_task() for managed_node3/fail 11389 
1726854872.56493: worker is 1 (out of 1 available) 11389 1726854872.56507: exiting _queue_task() for managed_node3/fail 11389 1726854872.56521: done queuing things up, now waiting for results queue to drain 11389 1726854872.56523: waiting for pending results... 11389 1726854872.56982: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11389 1726854872.57154: in run() - task 0affcc66-ac2b-deb8-c119-00000000007e 11389 1726854872.57173: variable 'ansible_search_path' from source: unknown 11389 1726854872.57182: variable 'ansible_search_path' from source: unknown 11389 1726854872.57225: calling self._execute() 11389 1726854872.57326: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.57373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.57390: variable 'omit' from source: magic vars 11389 1726854872.57844: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.57901: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.58075: variable 'network_state' from source: role '' defaults 11389 1726854872.58094: Evaluated conditional (network_state != {}): False 11389 1726854872.58394: when evaluation is False, skipping this task 11389 1726854872.58398: _execute() done 11389 1726854872.58401: dumping result to json 11389 1726854872.58403: done dumping result, returning 11389 1726854872.58408: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-deb8-c119-00000000007e] 11389 1726854872.58411: sending task result for task 0affcc66-ac2b-deb8-c119-00000000007e 11389 1726854872.58477: done sending task result for task 
0affcc66-ac2b-deb8-c119-00000000007e 11389 1726854872.58480: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854872.58545: no more pending results, returning what we have 11389 1726854872.58548: results queue empty 11389 1726854872.58549: checking for any_errors_fatal 11389 1726854872.58557: done checking for any_errors_fatal 11389 1726854872.58558: checking for max_fail_percentage 11389 1726854872.58560: done checking for max_fail_percentage 11389 1726854872.58561: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.58562: done checking to see if all hosts have failed 11389 1726854872.58563: getting the remaining hosts for this loop 11389 1726854872.58564: done getting the remaining hosts for this loop 11389 1726854872.58568: getting the next task for host managed_node3 11389 1726854872.58576: done getting next task for host managed_node3 11389 1726854872.58581: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11389 1726854872.58585: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11389 1726854872.58614: getting variables 11389 1726854872.58616: in VariableManager get_vars() 11389 1726854872.58657: Calling all_inventory to load vars for managed_node3 11389 1726854872.58660: Calling groups_inventory to load vars for managed_node3 11389 1726854872.58663: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.58675: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.58679: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.58682: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.61134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.62844: done with get_vars() 11389 1726854872.62873: done getting variables 11389 1726854872.62931: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:54:32 -0400 (0:00:00.072) 0:00:25.052 ****** 11389 1726854872.62967: entering _queue_task() for managed_node3/fail 11389 1726854872.63303: worker is 1 (out of 1 available) 11389 1726854872.63319: exiting _queue_task() for managed_node3/fail 11389 1726854872.63330: done queuing things up, now waiting for results queue to drain 11389 1726854872.63332: waiting for pending results... 
11389 1726854872.63778: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11389 1726854872.63783: in run() - task 0affcc66-ac2b-deb8-c119-00000000007f 11389 1726854872.63786: variable 'ansible_search_path' from source: unknown 11389 1726854872.63792: variable 'ansible_search_path' from source: unknown 11389 1726854872.63796: calling self._execute() 11389 1726854872.63959: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.63963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.63982: variable 'omit' from source: magic vars 11389 1726854872.64646: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.64658: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.65143: variable 'network_state' from source: role '' defaults 11389 1726854872.65175: Evaluated conditional (network_state != {}): False 11389 1726854872.65179: when evaluation is False, skipping this task 11389 1726854872.65182: _execute() done 11389 1726854872.65209: dumping result to json 11389 1726854872.65212: done dumping result, returning 11389 1726854872.65223: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-deb8-c119-00000000007f] 11389 1726854872.65229: sending task result for task 0affcc66-ac2b-deb8-c119-00000000007f 11389 1726854872.65535: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000007f 11389 1726854872.65538: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854872.65577: no more pending results, returning what we have 11389 
1726854872.65580: results queue empty 11389 1726854872.65581: checking for any_errors_fatal 11389 1726854872.65585: done checking for any_errors_fatal 11389 1726854872.65586: checking for max_fail_percentage 11389 1726854872.65589: done checking for max_fail_percentage 11389 1726854872.65590: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.65591: done checking to see if all hosts have failed 11389 1726854872.65592: getting the remaining hosts for this loop 11389 1726854872.65594: done getting the remaining hosts for this loop 11389 1726854872.65597: getting the next task for host managed_node3 11389 1726854872.65603: done getting next task for host managed_node3 11389 1726854872.65607: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11389 1726854872.65611: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854872.65628: getting variables 11389 1726854872.65629: in VariableManager get_vars() 11389 1726854872.65670: Calling all_inventory to load vars for managed_node3 11389 1726854872.65673: Calling groups_inventory to load vars for managed_node3 11389 1726854872.65676: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.65684: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.65689: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.65693: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.67137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.68772: done with get_vars() 11389 1726854872.68802: done getting variables 11389 1726854872.68859: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:54:32 -0400 (0:00:00.059) 0:00:25.112 ****** 11389 1726854872.68899: entering _queue_task() for managed_node3/fail 11389 1726854872.69350: worker is 1 (out of 1 available) 11389 1726854872.69361: exiting _queue_task() for managed_node3/fail 11389 1726854872.69372: done queuing things up, now waiting for results queue to drain 11389 1726854872.69373: waiting for pending results... 
11389 1726854872.69570: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11389 1726854872.69766: in run() - task 0affcc66-ac2b-deb8-c119-000000000080 11389 1726854872.69776: variable 'ansible_search_path' from source: unknown 11389 1726854872.69812: variable 'ansible_search_path' from source: unknown 11389 1726854872.69838: calling self._execute() 11389 1726854872.69954: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.69982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.69986: variable 'omit' from source: magic vars 11389 1726854872.70408: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.70462: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.70628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854872.72872: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854872.72994: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854872.73003: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854872.73043: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854872.73073: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854872.73161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.73203: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.73238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.73281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.73302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.73406: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.73434: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11389 1726854872.73593: variable 'ansible_distribution' from source: facts 11389 1726854872.73597: variable '__network_rh_distros' from source: role '' defaults 11389 1726854872.73599: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11389 1726854872.73840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.73877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.73908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 
1726854872.73953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.73980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.74072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.74079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.74094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.74135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.74152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.74206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.74234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11389 1726854872.74291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.74313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.74331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.74656: variable 'network_connections' from source: task vars 11389 1726854872.74673: variable 'port2_profile' from source: play vars 11389 1726854872.74826: variable 'port2_profile' from source: play vars 11389 1726854872.74829: variable 'port1_profile' from source: play vars 11389 1726854872.74834: variable 'port1_profile' from source: play vars 11389 1726854872.74845: variable 'controller_profile' from source: play vars 11389 1726854872.74909: variable 'controller_profile' from source: play vars 11389 1726854872.74922: variable 'network_state' from source: role '' defaults 11389 1726854872.75061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854872.75190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854872.75232: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854872.75266: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854872.75307: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854872.75365: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854872.75397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854872.75423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.75495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854872.75499: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11389 1726854872.75501: when evaluation is False, skipping this task 11389 1726854872.75503: _execute() done 11389 1726854872.75504: dumping result to json 11389 1726854872.75506: done dumping result, returning 11389 1726854872.75508: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-deb8-c119-000000000080] 11389 1726854872.75514: sending task result for task 0affcc66-ac2b-deb8-c119-000000000080 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 
0", "skip_reason": "Conditional result was False" } 11389 1726854872.75764: no more pending results, returning what we have 11389 1726854872.75768: results queue empty 11389 1726854872.75769: checking for any_errors_fatal 11389 1726854872.75775: done checking for any_errors_fatal 11389 1726854872.75776: checking for max_fail_percentage 11389 1726854872.75778: done checking for max_fail_percentage 11389 1726854872.75779: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.75780: done checking to see if all hosts have failed 11389 1726854872.75781: getting the remaining hosts for this loop 11389 1726854872.75783: done getting the remaining hosts for this loop 11389 1726854872.75788: getting the next task for host managed_node3 11389 1726854872.75797: done getting next task for host managed_node3 11389 1726854872.75802: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11389 1726854872.75806: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854872.75825: getting variables 11389 1726854872.75828: in VariableManager get_vars() 11389 1726854872.75873: Calling all_inventory to load vars for managed_node3 11389 1726854872.75876: Calling groups_inventory to load vars for managed_node3 11389 1726854872.75879: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.76009: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.76013: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.76018: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.76700: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000080 11389 1726854872.76703: WORKER PROCESS EXITING 11389 1726854872.77527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.78998: done with get_vars() 11389 1726854872.79016: done getting variables 11389 1726854872.79059: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:54:32 -0400 (0:00:00.101) 0:00:25.213 ****** 11389 1726854872.79084: entering _queue_task() for managed_node3/dnf 11389 1726854872.79323: worker is 1 (out of 1 available) 11389 1726854872.79337: exiting _queue_task() for managed_node3/dnf 11389 1726854872.79348: done queuing things up, now waiting for results queue to drain 11389 1726854872.79350: waiting for pending results... 
11389 1726854872.79533: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11389 1726854872.79626: in run() - task 0affcc66-ac2b-deb8-c119-000000000081 11389 1726854872.79636: variable 'ansible_search_path' from source: unknown 11389 1726854872.79640: variable 'ansible_search_path' from source: unknown 11389 1726854872.79669: calling self._execute() 11389 1726854872.79750: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.79754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.79763: variable 'omit' from source: magic vars 11389 1726854872.80055: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.80065: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.80208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854872.82208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854872.82264: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854872.82297: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854872.82324: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854872.82343: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854872.82408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.82432: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.82450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.82478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.82491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.82578: variable 'ansible_distribution' from source: facts 11389 1726854872.82582: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.82598: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11389 1726854872.82685: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854872.82899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.82902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.82904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.82923: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.82942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.82984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.83025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.83056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.83103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.83225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.83228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.83230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 
1726854872.83232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.83272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.83293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.83466: variable 'network_connections' from source: task vars 11389 1726854872.83484: variable 'port2_profile' from source: play vars 11389 1726854872.83563: variable 'port2_profile' from source: play vars 11389 1726854872.83584: variable 'port1_profile' from source: play vars 11389 1726854872.83664: variable 'port1_profile' from source: play vars 11389 1726854872.83678: variable 'controller_profile' from source: play vars 11389 1726854872.83741: variable 'controller_profile' from source: play vars 11389 1726854872.83836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854872.84093: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854872.84149: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854872.84252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854872.84260: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854872.84297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11389 1726854872.84339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854872.84375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.84427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854872.84490: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854872.84776: variable 'network_connections' from source: task vars 11389 1726854872.84802: variable 'port2_profile' from source: play vars 11389 1726854872.84862: variable 'port2_profile' from source: play vars 11389 1726854872.84878: variable 'port1_profile' from source: play vars 11389 1726854872.84971: variable 'port1_profile' from source: play vars 11389 1726854872.84974: variable 'controller_profile' from source: play vars 11389 1726854872.85049: variable 'controller_profile' from source: play vars 11389 1726854872.85145: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11389 1726854872.85148: when evaluation is False, skipping this task 11389 1726854872.85151: _execute() done 11389 1726854872.85153: dumping result to json 11389 1726854872.85155: done dumping result, returning 11389 1726854872.85158: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-000000000081] 11389 1726854872.85160: sending task result for task 
0affcc66-ac2b-deb8-c119-000000000081 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11389 1726854872.85310: no more pending results, returning what we have 11389 1726854872.85313: results queue empty 11389 1726854872.85314: checking for any_errors_fatal 11389 1726854872.85320: done checking for any_errors_fatal 11389 1726854872.85321: checking for max_fail_percentage 11389 1726854872.85323: done checking for max_fail_percentage 11389 1726854872.85324: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.85325: done checking to see if all hosts have failed 11389 1726854872.85325: getting the remaining hosts for this loop 11389 1726854872.85327: done getting the remaining hosts for this loop 11389 1726854872.85330: getting the next task for host managed_node3 11389 1726854872.85337: done getting next task for host managed_node3 11389 1726854872.85340: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11389 1726854872.85343: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854872.85363: getting variables 11389 1726854872.85365: in VariableManager get_vars() 11389 1726854872.85413: Calling all_inventory to load vars for managed_node3 11389 1726854872.85416: Calling groups_inventory to load vars for managed_node3 11389 1726854872.85418: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.85429: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.85432: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.85434: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.86001: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000081 11389 1726854872.86004: WORKER PROCESS EXITING 11389 1726854872.86526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.88128: done with get_vars() 11389 1726854872.88167: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11389 1726854872.88260: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:54:32 -0400 (0:00:00.092) 0:00:25.306 ****** 11389 1726854872.88297: entering _queue_task() for managed_node3/yum 11389 1726854872.88761: worker is 1 (out of 1 available) 
11389 1726854872.88790: exiting _queue_task() for managed_node3/yum 11389 1726854872.88800: done queuing things up, now waiting for results queue to drain 11389 1726854872.88801: waiting for pending results... 11389 1726854872.89305: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11389 1726854872.89401: in run() - task 0affcc66-ac2b-deb8-c119-000000000082 11389 1726854872.89514: variable 'ansible_search_path' from source: unknown 11389 1726854872.89525: variable 'ansible_search_path' from source: unknown 11389 1726854872.89569: calling self._execute() 11389 1726854872.89792: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.89965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.89971: variable 'omit' from source: magic vars 11389 1726854872.90559: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.90579: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.90780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854872.93484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854872.93896: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854872.93900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854872.93903: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854872.93906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854872.94111: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854872.94152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854872.94204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854872.94255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854872.94280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854872.94398: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.94422: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11389 1726854872.94429: when evaluation is False, skipping this task 11389 1726854872.94441: _execute() done 11389 1726854872.94452: dumping result to json 11389 1726854872.94460: done dumping result, returning 11389 1726854872.94475: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-000000000082] 11389 1726854872.94486: sending task result for task 0affcc66-ac2b-deb8-c119-000000000082 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was 
False" } 11389 1726854872.94760: no more pending results, returning what we have 11389 1726854872.94764: results queue empty 11389 1726854872.94768: checking for any_errors_fatal 11389 1726854872.94775: done checking for any_errors_fatal 11389 1726854872.94776: checking for max_fail_percentage 11389 1726854872.94778: done checking for max_fail_percentage 11389 1726854872.94779: checking to see if all hosts have failed and the running result is not ok 11389 1726854872.94780: done checking to see if all hosts have failed 11389 1726854872.94780: getting the remaining hosts for this loop 11389 1726854872.94782: done getting the remaining hosts for this loop 11389 1726854872.94785: getting the next task for host managed_node3 11389 1726854872.94795: done getting next task for host managed_node3 11389 1726854872.94800: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11389 1726854872.94804: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854872.94824: getting variables 11389 1726854872.94827: in VariableManager get_vars() 11389 1726854872.94871: Calling all_inventory to load vars for managed_node3 11389 1726854872.94875: Calling groups_inventory to load vars for managed_node3 11389 1726854872.94878: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854872.94894: Calling all_plugins_play to load vars for managed_node3 11389 1726854872.94898: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854872.94902: Calling groups_plugins_play to load vars for managed_node3 11389 1726854872.95601: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000082 11389 1726854872.95604: WORKER PROCESS EXITING 11389 1726854872.95972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854872.96925: done with get_vars() 11389 1726854872.96947: done getting variables 11389 1726854872.97011: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:54:32 -0400 (0:00:00.087) 0:00:25.393 ****** 11389 1726854872.97048: entering _queue_task() for managed_node3/fail 11389 1726854872.97398: worker is 1 (out of 1 available) 11389 1726854872.97410: exiting _queue_task() for managed_node3/fail 11389 1726854872.97422: done queuing things up, now waiting for results queue to drain 11389 1726854872.97423: waiting for pending results... 
11389 1726854872.97823: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11389 1726854872.97862: in run() - task 0affcc66-ac2b-deb8-c119-000000000083 11389 1726854872.97878: variable 'ansible_search_path' from source: unknown 11389 1726854872.97882: variable 'ansible_search_path' from source: unknown 11389 1726854872.97914: calling self._execute() 11389 1726854872.97994: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854872.97998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854872.98007: variable 'omit' from source: magic vars 11389 1726854872.98294: variable 'ansible_distribution_major_version' from source: facts 11389 1726854872.98306: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854872.98395: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854872.98530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854873.00153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854873.00283: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854873.00286: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854873.00315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854873.00468: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854873.00472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11389 1726854873.00475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.00498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.00539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.00692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.00696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.00698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.00700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.00710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.00728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.00773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.00803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.00829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.00873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.00900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.01083: variable 'network_connections' from source: task vars 11389 1726854873.01108: variable 'port2_profile' from source: play vars 11389 1726854873.01181: variable 'port2_profile' from source: play vars 11389 1726854873.01206: variable 'port1_profile' from source: play vars 11389 1726854873.01265: variable 'port1_profile' from source: play vars 11389 1726854873.01288: variable 'controller_profile' from source: play vars 11389 1726854873.01332: variable 'controller_profile' from source: play vars 11389 1726854873.01384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854873.01518: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854873.01547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854873.01571: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854873.01594: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854873.01624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854873.01641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854873.01661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.01681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854873.01725: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854873.01881: variable 'network_connections' from source: task vars 11389 1726854873.01884: variable 'port2_profile' from source: play vars 11389 1726854873.01927: variable 'port2_profile' from source: play vars 11389 1726854873.01933: variable 'port1_profile' from source: play vars 11389 1726854873.01979: variable 'port1_profile' from source: play vars 11389 1726854873.01985: variable 'controller_profile' from source: play vars 11389 1726854873.02026: variable 'controller_profile' from source: play vars 11389 1726854873.02046: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11389 1726854873.02057: when evaluation is False, skipping this task 11389 1726854873.02059: _execute() done 11389 1726854873.02061: dumping result to json 11389 1726854873.02064: done dumping result, returning 11389 1726854873.02066: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-000000000083] 11389 1726854873.02075: sending task result for task 0affcc66-ac2b-deb8-c119-000000000083 11389 1726854873.02162: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000083 11389 1726854873.02164: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11389 1726854873.02229: no more pending results, returning what we have 11389 1726854873.02233: results queue empty 11389 1726854873.02233: checking for any_errors_fatal 11389 1726854873.02239: done checking for any_errors_fatal 11389 1726854873.02240: checking for max_fail_percentage 11389 1726854873.02242: done checking for max_fail_percentage 11389 1726854873.02243: checking to see if all hosts have failed and the running result is not ok 11389 1726854873.02243: done checking to see if all hosts have failed 11389 1726854873.02244: getting the remaining hosts for this loop 11389 1726854873.02245: done getting the remaining hosts for this loop 11389 1726854873.02249: getting the next task for host managed_node3 11389 1726854873.02257: done getting next task for host managed_node3 11389 1726854873.02260: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11389 1726854873.02264: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854873.02283: getting variables 11389 1726854873.02284: in VariableManager get_vars() 11389 1726854873.02324: Calling all_inventory to load vars for managed_node3 11389 1726854873.02327: Calling groups_inventory to load vars for managed_node3 11389 1726854873.02329: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854873.02339: Calling all_plugins_play to load vars for managed_node3 11389 1726854873.02341: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854873.02343: Calling groups_plugins_play to load vars for managed_node3 11389 1726854873.03350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854873.04803: done with get_vars() 11389 1726854873.04834: done getting variables 11389 1726854873.04899: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:54:33 -0400 (0:00:00.078) 0:00:25.472 ****** 11389 1726854873.04937: entering _queue_task() for managed_node3/package 11389 1726854873.05197: worker is 1 (out of 1 available) 11389 1726854873.05210: exiting _queue_task() for managed_node3/package 11389 1726854873.05221: done queuing things up, now waiting for results queue to drain 11389 1726854873.05223: waiting for pending results... 11389 1726854873.05409: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11389 1726854873.05518: in run() - task 0affcc66-ac2b-deb8-c119-000000000084 11389 1726854873.05529: variable 'ansible_search_path' from source: unknown 11389 1726854873.05533: variable 'ansible_search_path' from source: unknown 11389 1726854873.05565: calling self._execute() 11389 1726854873.05640: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854873.05644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854873.05653: variable 'omit' from source: magic vars 11389 1726854873.05936: variable 'ansible_distribution_major_version' from source: facts 11389 1726854873.05945: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854873.06081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854873.06275: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854873.06308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854873.06335: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854873.06392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 
1726854873.06469: variable 'network_packages' from source: role '' defaults 11389 1726854873.06543: variable '__network_provider_setup' from source: role '' defaults 11389 1726854873.06551: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854873.06600: variable '__network_service_name_default_nm' from source: role '' defaults 11389 1726854873.06607: variable '__network_packages_default_nm' from source: role '' defaults 11389 1726854873.06658: variable '__network_packages_default_nm' from source: role '' defaults 11389 1726854873.06914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854873.08718: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854873.08763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854873.08793: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854873.08817: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854873.08836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854873.08898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.08921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.08937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11389 1726854873.08962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.08977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.09010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.09025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.09041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.09068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.09079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.09222: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11389 1726854873.09299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.09329: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.09345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.09371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.09382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.09446: variable 'ansible_python' from source: facts 11389 1726854873.09469: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11389 1726854873.09528: variable '__network_wpa_supplicant_required' from source: role '' defaults 11389 1726854873.09580: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11389 1726854873.09664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.09681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.09700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.09723: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.09733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.09770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854873.09788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854873.09805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.09828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854873.09839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854873.09936: variable 'network_connections' from source: task vars 11389 1726854873.09942: variable 'port2_profile' from source: play vars 11389 1726854873.10041: variable 'port2_profile' from source: play vars 11389 1726854873.10045: variable 'port1_profile' from source: play vars 11389 1726854873.10116: variable 'port1_profile' from source: play vars 11389 1726854873.10124: variable 'controller_profile' from source: play vars 11389 1726854873.10195: 
variable 'controller_profile' from source: play vars 11389 1726854873.10243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854873.10262: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854873.10285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854873.10311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854873.10346: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854873.10522: variable 'network_connections' from source: task vars 11389 1726854873.10525: variable 'port2_profile' from source: play vars 11389 1726854873.10594: variable 'port2_profile' from source: play vars 11389 1726854873.10603: variable 'port1_profile' from source: play vars 11389 1726854873.10733: variable 'port1_profile' from source: play vars 11389 1726854873.10737: variable 'controller_profile' from source: play vars 11389 1726854873.10746: variable 'controller_profile' from source: play vars 11389 1726854873.10771: variable '__network_packages_default_wireless' from source: role '' defaults 11389 1726854873.10825: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854873.11018: variable 'network_connections' from source: task vars 11389 1726854873.11021: variable 'port2_profile' from source: play vars 11389 1726854873.11070: variable 'port2_profile' from source: play vars 11389 1726854873.11074: variable 
'port1_profile' from source: play vars 11389 1726854873.11119: variable 'port1_profile' from source: play vars 11389 1726854873.11125: variable 'controller_profile' from source: play vars 11389 1726854873.11169: variable 'controller_profile' from source: play vars 11389 1726854873.11192: variable '__network_packages_default_team' from source: role '' defaults 11389 1726854873.11244: variable '__network_team_connections_defined' from source: role '' defaults 11389 1726854873.11692: variable 'network_connections' from source: task vars 11389 1726854873.11695: variable 'port2_profile' from source: play vars 11389 1726854873.11697: variable 'port2_profile' from source: play vars 11389 1726854873.11699: variable 'port1_profile' from source: play vars 11389 1726854873.11700: variable 'port1_profile' from source: play vars 11389 1726854873.11702: variable 'controller_profile' from source: play vars 11389 1726854873.11762: variable 'controller_profile' from source: play vars 11389 1726854873.11822: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854873.11885: variable '__network_service_name_default_initscripts' from source: role '' defaults 11389 1726854873.11902: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 1726854873.11962: variable '__network_packages_default_initscripts' from source: role '' defaults 11389 1726854873.12191: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11389 1726854873.12613: variable 'network_connections' from source: task vars 11389 1726854873.12616: variable 'port2_profile' from source: play vars 11389 1726854873.12661: variable 'port2_profile' from source: play vars 11389 1726854873.12668: variable 'port1_profile' from source: play vars 11389 1726854873.12712: variable 'port1_profile' from source: play vars 11389 1726854873.12718: variable 'controller_profile' from source: play vars 11389 1726854873.12760: variable 
'controller_profile' from source: play vars 11389 1726854873.12764: variable 'ansible_distribution' from source: facts 11389 1726854873.12771: variable '__network_rh_distros' from source: role '' defaults 11389 1726854873.12778: variable 'ansible_distribution_major_version' from source: facts 11389 1726854873.12791: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11389 1726854873.12899: variable 'ansible_distribution' from source: facts 11389 1726854873.12902: variable '__network_rh_distros' from source: role '' defaults 11389 1726854873.12906: variable 'ansible_distribution_major_version' from source: facts 11389 1726854873.12917: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11389 1726854873.13023: variable 'ansible_distribution' from source: facts 11389 1726854873.13027: variable '__network_rh_distros' from source: role '' defaults 11389 1726854873.13031: variable 'ansible_distribution_major_version' from source: facts 11389 1726854873.13057: variable 'network_provider' from source: set_fact 11389 1726854873.13068: variable 'ansible_facts' from source: unknown 11389 1726854873.13449: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11389 1726854873.13452: when evaluation is False, skipping this task 11389 1726854873.13455: _execute() done 11389 1726854873.13458: dumping result to json 11389 1726854873.13460: done dumping result, returning 11389 1726854873.13468: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-deb8-c119-000000000084] 11389 1726854873.13475: sending task result for task 0affcc66-ac2b-deb8-c119-000000000084 11389 1726854873.13565: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000084 11389 1726854873.13568: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11389 1726854873.13622: no more pending results, returning what we have 11389 1726854873.13626: results queue empty 11389 1726854873.13626: checking for any_errors_fatal 11389 1726854873.13632: done checking for any_errors_fatal 11389 1726854873.13632: checking for max_fail_percentage 11389 1726854873.13634: done checking for max_fail_percentage 11389 1726854873.13635: checking to see if all hosts have failed and the running result is not ok 11389 1726854873.13636: done checking to see if all hosts have failed 11389 1726854873.13637: getting the remaining hosts for this loop 11389 1726854873.13638: done getting the remaining hosts for this loop 11389 1726854873.13645: getting the next task for host managed_node3 11389 1726854873.13652: done getting next task for host managed_node3 11389 1726854873.13655: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11389 1726854873.13659: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854873.13681: getting variables 11389 1726854873.13683: in VariableManager get_vars() 11389 1726854873.13727: Calling all_inventory to load vars for managed_node3 11389 1726854873.13730: Calling groups_inventory to load vars for managed_node3 11389 1726854873.13732: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854873.13742: Calling all_plugins_play to load vars for managed_node3 11389 1726854873.13745: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854873.13747: Calling groups_plugins_play to load vars for managed_node3 11389 1726854873.15256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854873.16793: done with get_vars() 11389 1726854873.16815: done getting variables 11389 1726854873.16871: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:54:33 -0400 (0:00:00.119) 0:00:25.592 ****** 11389 1726854873.16906: entering _queue_task() for managed_node3/package 11389 1726854873.17205: worker is 1 (out of 1 available) 11389 1726854873.17218: exiting _queue_task() for managed_node3/package 11389 1726854873.17228: done queuing things up, now waiting for results queue to drain 11389 1726854873.17229: waiting for pending results... 
11389 1726854873.17609: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11389 1726854873.17642: in run() - task 0affcc66-ac2b-deb8-c119-000000000085
11389 1726854873.17661: variable 'ansible_search_path' from source: unknown
11389 1726854873.17669: variable 'ansible_search_path' from source: unknown
11389 1726854873.17712: calling self._execute()
11389 1726854873.17817: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854873.17828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854873.17843: variable 'omit' from source: magic vars
11389 1726854873.18197: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.18215: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854873.18347: variable 'network_state' from source: role '' defaults
11389 1726854873.18370: Evaluated conditional (network_state != {}): False
11389 1726854873.18380: when evaluation is False, skipping this task
11389 1726854873.18466: _execute() done
11389 1726854873.18469: dumping result to json
11389 1726854873.18471: done dumping result, returning
11389 1726854873.18474: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-deb8-c119-000000000085]
11389 1726854873.18476: sending task result for task 0affcc66-ac2b-deb8-c119-000000000085
11389 1726854873.18551: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000085
11389 1726854873.18554: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11389 1726854873.18621: no more pending results, returning what we have
11389 1726854873.18626: results queue empty
11389 1726854873.18626: checking for any_errors_fatal
11389 1726854873.18634: done checking for any_errors_fatal
11389 1726854873.18634: checking for max_fail_percentage
11389 1726854873.18637: done checking for max_fail_percentage
11389 1726854873.18638: checking to see if all hosts have failed and the running result is not ok
11389 1726854873.18639: done checking to see if all hosts have failed
11389 1726854873.18640: getting the remaining hosts for this loop
11389 1726854873.18641: done getting the remaining hosts for this loop
11389 1726854873.18644: getting the next task for host managed_node3
11389 1726854873.18652: done getting next task for host managed_node3
11389 1726854873.18658: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11389 1726854873.18662: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11389 1726854873.18682: getting variables
11389 1726854873.18685: in VariableManager get_vars()
11389 1726854873.18728: Calling all_inventory to load vars for managed_node3
11389 1726854873.18732: Calling groups_inventory to load vars for managed_node3
11389 1726854873.18735: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854873.18748: Calling all_plugins_play to load vars for managed_node3
11389 1726854873.18751: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854873.18755: Calling groups_plugins_play to load vars for managed_node3
11389 1726854873.20284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854873.21859: done with get_vars()
11389 1726854873.21882: done getting variables
11389 1726854873.21939: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  13:54:33 -0400 (0:00:00.050)       0:00:25.642 ******
11389 1726854873.21973: entering _queue_task() for managed_node3/package
11389 1726854873.22260: worker is 1 (out of 1 available)
11389 1726854873.22273: exiting _queue_task() for managed_node3/package
11389 1726854873.22284: done queuing things up, now waiting for results queue to drain
11389 1726854873.22286: waiting for pending results...
11389 1726854873.22617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11389 1726854873.22709: in run() - task 0affcc66-ac2b-deb8-c119-000000000086
11389 1726854873.22731: variable 'ansible_search_path' from source: unknown
11389 1726854873.22739: variable 'ansible_search_path' from source: unknown
11389 1726854873.22778: calling self._execute()
11389 1726854873.22933: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854873.22937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854873.22939: variable 'omit' from source: magic vars
11389 1726854873.23272: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.23291: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854873.23412: variable 'network_state' from source: role '' defaults
11389 1726854873.23428: Evaluated conditional (network_state != {}): False
11389 1726854873.23436: when evaluation is False, skipping this task
11389 1726854873.23443: _execute() done
11389 1726854873.23448: dumping result to json
11389 1726854873.23456: done dumping result, returning
11389 1726854873.23466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-deb8-c119-000000000086]
11389 1726854873.23584: sending task result for task 0affcc66-ac2b-deb8-c119-000000000086
11389 1726854873.23650: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000086
11389 1726854873.23654: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11389 1726854873.23741: no more pending results, returning what we have
11389 1726854873.23745: results queue empty
11389 1726854873.23746: checking for any_errors_fatal
11389 1726854873.23756: done checking for any_errors_fatal
11389 1726854873.23757: checking for max_fail_percentage
11389 1726854873.23759: done checking for max_fail_percentage
11389 1726854873.23760: checking to see if all hosts have failed and the running result is not ok
11389 1726854873.23761: done checking to see if all hosts have failed
11389 1726854873.23762: getting the remaining hosts for this loop
11389 1726854873.23764: done getting the remaining hosts for this loop
11389 1726854873.23768: getting the next task for host managed_node3
11389 1726854873.23776: done getting next task for host managed_node3
11389 1726854873.23780: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
11389 1726854873.23785: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11389 1726854873.23810: getting variables
11389 1726854873.23813: in VariableManager get_vars()
11389 1726854873.23856: Calling all_inventory to load vars for managed_node3
11389 1726854873.23859: Calling groups_inventory to load vars for managed_node3
11389 1726854873.23862: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854873.23875: Calling all_plugins_play to load vars for managed_node3
11389 1726854873.23879: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854873.23882: Calling groups_plugins_play to load vars for managed_node3
11389 1726854873.25439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854873.26906: done with get_vars()
11389 1726854873.26928: done getting variables
11389 1726854873.26974: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  13:54:33 -0400 (0:00:00.050)       0:00:25.693 ******
11389 1726854873.27004: entering _queue_task() for managed_node3/service
11389 1726854873.27284: worker is 1 (out of 1 available)
11389 1726854873.27301: exiting _queue_task() for managed_node3/service
11389 1726854873.27312: done queuing things up, now waiting for results queue to drain
11389 1726854873.27314: waiting for pending results...
11389 1726854873.27501: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
11389 1726854873.27599: in run() - task 0affcc66-ac2b-deb8-c119-000000000087
11389 1726854873.27609: variable 'ansible_search_path' from source: unknown
11389 1726854873.27613: variable 'ansible_search_path' from source: unknown
11389 1726854873.27640: calling self._execute()
11389 1726854873.27718: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854873.27722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854873.27731: variable 'omit' from source: magic vars
11389 1726854873.28002: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.28012: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854873.28090: variable '__network_wireless_connections_defined' from source: role '' defaults
11389 1726854873.28223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11389 1726854873.30514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11389 1726854873.30581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11389 1726854873.30611: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11389 1726854873.30639: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11389 1726854873.30659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11389 1726854873.30721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.30746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.30772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.30800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.30810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.30842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.30860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.30878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.30906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.30916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.30944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.30960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.30979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.31006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.31016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.31133: variable 'network_connections' from source: task vars
11389 1726854873.31143: variable 'port2_profile' from source: play vars
11389 1726854873.31192: variable 'port2_profile' from source: play vars
11389 1726854873.31202: variable 'port1_profile' from source: play vars
11389 1726854873.31245: variable 'port1_profile' from source: play vars
11389 1726854873.31252: variable 'controller_profile' from source: play vars
11389 1726854873.31297: variable 'controller_profile' from source: play vars
11389 1726854873.31346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11389 1726854873.31468: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11389 1726854873.31495: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11389 1726854873.31518: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11389 1726854873.31541: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11389 1726854873.31572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11389 1726854873.31586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11389 1726854873.31606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.31625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11389 1726854873.31664: variable '__network_team_connections_defined' from source: role '' defaults
11389 1726854873.31819: variable 'network_connections' from source: task vars
11389 1726854873.31822: variable 'port2_profile' from source: play vars
11389 1726854873.31872: variable 'port2_profile' from source: play vars
11389 1726854873.31875: variable 'port1_profile' from source: play vars
11389 1726854873.31920: variable 'port1_profile' from source: play vars
11389 1726854873.31926: variable 'controller_profile' from source: play vars
11389 1726854873.31972: variable 'controller_profile' from source: play vars
11389 1726854873.31991: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
11389 1726854873.32001: when evaluation is False, skipping this task
11389 1726854873.32003: _execute() done
11389 1726854873.32006: dumping result to json
11389 1726854873.32009: done dumping result, returning
11389 1726854873.32011: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-deb8-c119-000000000087]
11389 1726854873.32013: sending task result for task 0affcc66-ac2b-deb8-c119-000000000087
11389 1726854873.32100: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000087
11389 1726854873.32103: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11389 1726854873.32145: no more pending results, returning what we have
11389 1726854873.32148: results queue empty
11389 1726854873.32148: checking for any_errors_fatal
11389 1726854873.32153: done checking for any_errors_fatal
11389 1726854873.32154: checking for max_fail_percentage
11389 1726854873.32156: done checking for max_fail_percentage
11389 1726854873.32157: checking to see if all hosts have failed and the running result is not ok
11389 1726854873.32158: done checking to see if all hosts have failed
11389 1726854873.32158: getting the remaining hosts for this loop
11389 1726854873.32159: done getting the remaining hosts for this loop
11389 1726854873.32162: getting the next task for host managed_node3
11389 1726854873.32171: done getting next task for host managed_node3
11389 1726854873.32175: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
11389 1726854873.32179: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11389 1726854873.32197: getting variables
11389 1726854873.32199: in VariableManager get_vars()
11389 1726854873.32262: Calling all_inventory to load vars for managed_node3
11389 1726854873.32265: Calling groups_inventory to load vars for managed_node3
11389 1726854873.32269: Calling all_plugins_inventory to load vars for managed_node3
11389 1726854873.32279: Calling all_plugins_play to load vars for managed_node3
11389 1726854873.32282: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854873.32284: Calling groups_plugins_play to load vars for managed_node3
11389 1726854873.33661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854873.36168: done with get_vars()
11389 1726854873.36192: done getting variables
11389 1726854873.36251: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  13:54:33 -0400 (0:00:00.092)       0:00:25.785 ******
11389 1726854873.36286: entering _queue_task() for managed_node3/service
11389 1726854873.36611: worker is 1 (out of 1 available)
11389 1726854873.36623: exiting _queue_task() for managed_node3/service
11389 1726854873.36634: done queuing things up, now waiting for results queue to drain
11389 1726854873.36635: waiting for pending results...
11389 1726854873.37066: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
11389 1726854873.37208: in run() - task 0affcc66-ac2b-deb8-c119-000000000088
11389 1726854873.37224: variable 'ansible_search_path' from source: unknown
11389 1726854873.37548: variable 'ansible_search_path' from source: unknown
11389 1726854873.37552: calling self._execute()
11389 1726854873.37555: variable 'ansible_host' from source: host vars for 'managed_node3'
11389 1726854873.37601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11389 1726854873.37686: variable 'omit' from source: magic vars
11389 1726854873.38398: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.38416: Evaluated conditional (ansible_distribution_major_version != '6'): True
11389 1726854873.38718: variable 'network_provider' from source: set_fact
11389 1726854873.38730: variable 'network_state' from source: role '' defaults
11389 1726854873.38745: Evaluated conditional (network_provider == "nm" or network_state != {}): True
11389 1726854873.39095: variable 'omit' from source: magic vars
11389 1726854873.39098: variable 'omit' from source: magic vars
11389 1726854873.39202: variable 'network_service_name' from source: role '' defaults
11389 1726854873.39206: variable 'network_service_name' from source: role '' defaults
11389 1726854873.39282: variable '__network_provider_setup' from source: role '' defaults
11389 1726854873.39405: variable '__network_service_name_default_nm' from source: role '' defaults
11389 1726854873.39476: variable '__network_service_name_default_nm' from source: role '' defaults
11389 1726854873.39539: variable '__network_packages_default_nm' from source: role '' defaults
11389 1726854873.39603: variable '__network_packages_default_nm' from source: role '' defaults
11389 1726854873.40102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11389 1726854873.41764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11389 1726854873.41820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11389 1726854873.41846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11389 1726854873.41872: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11389 1726854873.41896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11389 1726854873.41955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.41976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.41998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.42026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.42036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.42070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.42084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.42104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.42131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.42141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.42426: variable '__network_packages_default_gobject_packages' from source: role '' defaults
11389 1726854873.42865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.42868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.42870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.42873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.42874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.42935: variable 'ansible_python' from source: facts
11389 1726854873.42962: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
11389 1726854873.43058: variable '__network_wpa_supplicant_required' from source: role '' defaults
11389 1726854873.43159: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
11389 1726854873.43294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.43334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.43362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.43406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.43447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.43513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11389 1726854873.43580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11389 1726854873.43658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.43724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11389 1726854873.43798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11389 1726854873.43992: variable 'network_connections' from source: task vars
11389 1726854873.43999: variable 'port2_profile' from source: play vars
11389 1726854873.44050: variable 'port2_profile' from source: play vars
11389 1726854873.44061: variable 'port1_profile' from source: play vars
11389 1726854873.44122: variable 'port1_profile' from source: play vars
11389 1726854873.44132: variable 'controller_profile' from source: play vars
11389 1726854873.44181: variable 'controller_profile' from source: play vars
11389 1726854873.44254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11389 1726854873.44391: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11389 1726854873.44430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11389 1726854873.44460: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11389 1726854873.44491: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11389 1726854873.44536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11389 1726854873.44558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11389 1726854873.44581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11389 1726854873.44605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11389 1726854873.44643: variable '__network_wireless_connections_defined' from source: role '' defaults
11389 1726854873.44822: variable 'network_connections' from source: task vars
11389 1726854873.44827: variable 'port2_profile' from source: play vars
11389 1726854873.44881: variable 'port2_profile' from source: play vars
11389 1726854873.44891: variable 'port1_profile' from source: play vars
11389 1726854873.44940: variable 'port1_profile' from source: play vars
11389 1726854873.44950: variable 'controller_profile' from source: play vars
11389 1726854873.45003: variable 'controller_profile' from source: play vars
11389 1726854873.45027: variable '__network_packages_default_wireless' from source: role '' defaults
11389 1726854873.45085: variable '__network_wireless_connections_defined' from source: role '' defaults
11389 1726854873.45268: variable 'network_connections' from source: task vars
11389 1726854873.45271: variable 'port2_profile' from source: play vars
11389 1726854873.45322: variable 'port2_profile' from source: play vars
11389 1726854873.45329: variable 'port1_profile' from source: play vars
11389 1726854873.45377: variable 'port1_profile' from source: play vars
11389 1726854873.45383: variable 'controller_profile' from source: play vars
11389 1726854873.45435: variable 'controller_profile' from source: play vars
11389 1726854873.45452: variable '__network_packages_default_team' from source: role '' defaults
11389 1726854873.45508: variable '__network_team_connections_defined' from source: role '' defaults
11389 1726854873.45691: variable 'network_connections' from source: task vars
11389 1726854873.45695: variable 'port2_profile' from source: play vars
11389 1726854873.45745: variable 'port2_profile' from source: play vars
11389 1726854873.45752: variable 'port1_profile' from source: play vars
11389 1726854873.45802: variable 'port1_profile' from source: play vars
11389 1726854873.45808: variable 'controller_profile' from source: play vars
11389 1726854873.45863: variable 'controller_profile' from source: play vars
11389 1726854873.45914: variable '__network_service_name_default_initscripts' from source: role '' defaults
11389 1726854873.45977: variable '__network_service_name_default_initscripts' from source: role '' defaults
11389 1726854873.46107: variable '__network_packages_default_initscripts' from source: role '' defaults
11389 1726854873.46110: variable '__network_packages_default_initscripts' from source: role '' defaults
11389 1726854873.46266: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
11389 1726854873.46743: variable 'network_connections' from source: task vars
11389 1726854873.46753: variable 'port2_profile' from source: play vars
11389 1726854873.46818: variable 'port2_profile' from source: play vars
11389 1726854873.46831: variable 'port1_profile' from source: play vars
11389 1726854873.46890: variable 'port1_profile' from source: play vars
11389 1726854873.46904: variable 'controller_profile' from source: play vars
11389 1726854873.46963: variable 'controller_profile' from source: play vars
11389 1726854873.46976: variable 'ansible_distribution' from source: facts
11389 1726854873.46984: variable '__network_rh_distros' from source: role '' defaults
11389 1726854873.46998: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.47017: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
11389 1726854873.47184: variable 'ansible_distribution' from source: facts
11389 1726854873.47199: variable '__network_rh_distros' from source: role '' defaults
11389 1726854873.47209: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.47227: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
11389 1726854873.47391: variable 'ansible_distribution' from source: facts
11389 1726854873.47397: variable '__network_rh_distros' from source: role '' defaults
11389 1726854873.47431: variable 'ansible_distribution_major_version' from source: facts
11389 1726854873.47448: variable 'network_provider' from source: set_fact
11389 1726854873.47469: variable 'omit' from source: magic vars
11389 1726854873.47491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11389 1726854873.47518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11389 1726854873.47528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11389 
1726854873.47541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854873.47557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854873.47591: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854873.47594: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854873.47597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854873.47675: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854873.47681: Set connection var ansible_timeout to 10 11389 1726854873.47684: Set connection var ansible_connection to ssh 11389 1726854873.47690: Set connection var ansible_shell_type to sh 11389 1726854873.47695: Set connection var ansible_pipelining to False 11389 1726854873.47700: Set connection var ansible_shell_executable to /bin/sh 11389 1726854873.47718: variable 'ansible_shell_executable' from source: unknown 11389 1726854873.47724: variable 'ansible_connection' from source: unknown 11389 1726854873.47730: variable 'ansible_module_compression' from source: unknown 11389 1726854873.47733: variable 'ansible_shell_type' from source: unknown 11389 1726854873.47735: variable 'ansible_shell_executable' from source: unknown 11389 1726854873.47737: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854873.47739: variable 'ansible_pipelining' from source: unknown 11389 1726854873.47741: variable 'ansible_timeout' from source: unknown 11389 1726854873.47746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854873.47818: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854873.47828: variable 'omit' from source: magic vars 11389 1726854873.47833: starting attempt loop 11389 1726854873.47840: running the handler 11389 1726854873.47899: variable 'ansible_facts' from source: unknown 11389 1726854873.48351: _low_level_execute_command(): starting 11389 1726854873.48357: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854873.48839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854873.48843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854873.48846: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.48897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854873.48901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.48990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11389 1726854873.50683: stdout chunk (state=3): >>>/root <<< 11389 1726854873.50785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854873.50820: stderr chunk (state=3): >>><<< 11389 1726854873.50823: stdout chunk (state=3): >>><<< 11389 1726854873.50836: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854873.50893: _low_level_execute_command(): starting 11389 1726854873.50897: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149 `" && echo ansible-tmp-1726854873.5084112-12638-35943191594149="` echo /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149 `" ) && sleep 0' 11389 
1726854873.51275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854873.51278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.51281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854873.51283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854873.51285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.51337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854873.51345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854873.51347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.51404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854873.53305: stdout chunk (state=3): >>>ansible-tmp-1726854873.5084112-12638-35943191594149=/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149 <<< 11389 1726854873.53407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854873.53432: stderr chunk (state=3): >>><<< 11389 1726854873.53435: stdout chunk (state=3): >>><<< 
11389 1726854873.53450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854873.5084112-12638-35943191594149=/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854873.53480: variable 'ansible_module_compression' from source: unknown 11389 1726854873.53526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11389 1726854873.53572: variable 'ansible_facts' from source: unknown 11389 1726854873.53707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py 11389 1726854873.53810: Sending initial data 11389 1726854873.53814: Sent initial data (155 bytes) 11389 1726854873.54257: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854873.54262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found <<< 11389 1726854873.54264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854873.54269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854873.54271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.54323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854873.54327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854873.54329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.54395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854873.55944: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11389 1726854873.55948: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854873.56002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854873.56064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpo6z2p1ix /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py <<< 11389 1726854873.56067: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py" <<< 11389 1726854873.56120: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpo6z2p1ix" to remote "/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py" <<< 11389 1726854873.57268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854873.57315: stderr chunk (state=3): >>><<< 11389 1726854873.57318: stdout chunk (state=3): >>><<< 11389 1726854873.57337: done transferring module to remote 11389 1726854873.57346: _low_level_execute_command(): starting 11389 1726854873.57351: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/ /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py && sleep 0' 11389 1726854873.57940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.57990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854873.59894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854873.59898: stdout chunk (state=3): >>><<< 11389 1726854873.59900: stderr chunk (state=3): >>><<< 11389 1726854873.59903: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854873.59905: _low_level_execute_command(): starting 11389 1726854873.59907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/AnsiballZ_systemd.py && sleep 0' 11389 1726854873.60434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854873.60450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854873.60462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854873.60481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854873.60496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854873.60504: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854873.60514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.60533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11389 1726854873.60542: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854873.60609: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854873.60660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854873.60663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854873.60700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.60773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854873.89908: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": 
"0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10457088", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325038592", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "679376000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", 
"IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11389 1726854873.91695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854873.91699: stdout chunk (state=3): >>><<< 11389 1726854873.91702: stderr chunk (state=3): >>><<< 11389 1726854873.92097: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "707", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainStartTimestampMonotonic": "21968417", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ExecMainHandoffTimestampMonotonic": "21983708", "ExecMainPID": "707", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10457088", "MemoryPeak": "14389248", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325038592", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "679376000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service network.target NetworkManager-wait-online.service multi-user.target", "After": "dbus.socket system.slice sysinit.target basic.target cloud-init-local.service network-pre.target dbus-broker.service systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:53:43 EDT", "StateChangeTimestampMonotonic": "594577034", "InactiveExitTimestamp": "Fri 2024-09-20 13:44:10 EDT", "InactiveExitTimestampMonotonic": "21968779", "ActiveEnterTimestamp": "Fri 2024-09-20 13:44:11 EDT", "ActiveEnterTimestampMonotonic": "22424933", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:44:10 EDT", "ConditionTimestampMonotonic": "21967453", "AssertTimestamp": "Fri 2024-09-20 13:44:10 EDT", "AssertTimestampMonotonic": "21967456", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f4cf7eb47fc94dda90459896c834c364", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854873.92134: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854873.92159: _low_level_execute_command(): starting 11389 1726854873.92209: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854873.5084112-12638-35943191594149/ > /dev/null 2>&1 && sleep 0' 11389 1726854873.93369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854873.93668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854873.93680: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854873.93776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854873.95608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854873.95650: stderr chunk (state=3): >>><<< 11389 1726854873.95696: stdout chunk (state=3): >>><<< 11389 1726854873.95715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854873.95722: handler run complete 11389 1726854873.95793: attempt loop complete, returning result 11389 
1726854873.95796: _execute() done 11389 1726854873.95799: dumping result to json 11389 1726854873.95814: done dumping result, returning 11389 1726854873.95822: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-deb8-c119-000000000088] 11389 1726854873.95827: sending task result for task 0affcc66-ac2b-deb8-c119-000000000088 11389 1726854873.96698: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000088 11389 1726854873.96702: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854873.96760: no more pending results, returning what we have 11389 1726854873.96763: results queue empty 11389 1726854873.96764: checking for any_errors_fatal 11389 1726854873.96773: done checking for any_errors_fatal 11389 1726854873.96774: checking for max_fail_percentage 11389 1726854873.96776: done checking for max_fail_percentage 11389 1726854873.96777: checking to see if all hosts have failed and the running result is not ok 11389 1726854873.96778: done checking to see if all hosts have failed 11389 1726854873.96779: getting the remaining hosts for this loop 11389 1726854873.96780: done getting the remaining hosts for this loop 11389 1726854873.96784: getting the next task for host managed_node3 11389 1726854873.96792: done getting next task for host managed_node3 11389 1726854873.96797: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11389 1726854873.96802: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854873.96815: getting variables 11389 1726854873.96816: in VariableManager get_vars() 11389 1726854873.96856: Calling all_inventory to load vars for managed_node3 11389 1726854873.96859: Calling groups_inventory to load vars for managed_node3 11389 1726854873.96862: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854873.96875: Calling all_plugins_play to load vars for managed_node3 11389 1726854873.96878: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854873.96881: Calling groups_plugins_play to load vars for managed_node3 11389 1726854873.99217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854874.00875: done with get_vars() 11389 1726854874.00905: done getting variables 11389 1726854874.00969: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:54:34 -0400 
(0:00:00.647) 0:00:26.433 ****** 11389 1726854874.01008: entering _queue_task() for managed_node3/service 11389 1726854874.01343: worker is 1 (out of 1 available) 11389 1726854874.01355: exiting _queue_task() for managed_node3/service 11389 1726854874.01365: done queuing things up, now waiting for results queue to drain 11389 1726854874.01369: waiting for pending results... 11389 1726854874.02105: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11389 1726854874.02206: in run() - task 0affcc66-ac2b-deb8-c119-000000000089 11389 1726854874.02321: variable 'ansible_search_path' from source: unknown 11389 1726854874.02333: variable 'ansible_search_path' from source: unknown 11389 1726854874.02380: calling self._execute() 11389 1726854874.02491: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.02504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854874.02693: variable 'omit' from source: magic vars 11389 1726854874.03351: variable 'ansible_distribution_major_version' from source: facts 11389 1726854874.03372: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854874.03629: variable 'network_provider' from source: set_fact 11389 1726854874.03747: Evaluated conditional (network_provider == "nm"): True 11389 1726854874.03876: variable '__network_wpa_supplicant_required' from source: role '' defaults 11389 1726854874.03999: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11389 1726854874.04213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854874.07199: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854874.07218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 
1726854874.07294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854874.07298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854874.07321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854874.07410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854874.07441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854874.07594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854874.07597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854874.07599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854874.07601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854874.07604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854874.07702: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854874.07705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854874.07708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854874.07710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854874.07744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854874.07769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854874.07806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854874.07819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854874.07972: variable 'network_connections' from source: task vars 11389 1726854874.08193: variable 'port2_profile' from source: play vars 11389 1726854874.08197: variable 'port2_profile' from source: play vars 
11389 1726854874.08199: variable 'port1_profile' from source: play vars 11389 1726854874.08202: variable 'port1_profile' from source: play vars 11389 1726854874.08204: variable 'controller_profile' from source: play vars 11389 1726854874.08209: variable 'controller_profile' from source: play vars 11389 1726854874.08285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11389 1726854874.08495: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11389 1726854874.08530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11389 1726854874.08557: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11389 1726854874.08583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11389 1726854874.08964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11389 1726854874.08983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11389 1726854874.09014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854874.09038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11389 1726854874.09083: variable '__network_wireless_connections_defined' from source: role '' defaults 11389 1726854874.09386: variable 'network_connections' from source: task vars 11389 1726854874.09399: 
variable 'port2_profile' from source: play vars 11389 1726854874.09463: variable 'port2_profile' from source: play vars 11389 1726854874.09478: variable 'port1_profile' from source: play vars 11389 1726854874.09539: variable 'port1_profile' from source: play vars 11389 1726854874.09564: variable 'controller_profile' from source: play vars 11389 1726854874.09628: variable 'controller_profile' from source: play vars 11389 1726854874.09665: Evaluated conditional (__network_wpa_supplicant_required): False 11389 1726854874.09676: when evaluation is False, skipping this task 11389 1726854874.09683: _execute() done 11389 1726854874.09691: dumping result to json 11389 1726854874.09699: done dumping result, returning 11389 1726854874.09712: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-deb8-c119-000000000089] 11389 1726854874.09721: sending task result for task 0affcc66-ac2b-deb8-c119-000000000089 11389 1726854874.10047: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000089 11389 1726854874.10050: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11389 1726854874.10099: no more pending results, returning what we have 11389 1726854874.10102: results queue empty 11389 1726854874.10103: checking for any_errors_fatal 11389 1726854874.10121: done checking for any_errors_fatal 11389 1726854874.10122: checking for max_fail_percentage 11389 1726854874.10124: done checking for max_fail_percentage 11389 1726854874.10125: checking to see if all hosts have failed and the running result is not ok 11389 1726854874.10126: done checking to see if all hosts have failed 11389 1726854874.10126: getting the remaining hosts for this loop 11389 1726854874.10128: done getting the remaining hosts for this loop 11389 1726854874.10131: getting the next task for host 
managed_node3 11389 1726854874.10137: done getting next task for host managed_node3 11389 1726854874.10141: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11389 1726854874.10145: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854874.10162: getting variables 11389 1726854874.10164: in VariableManager get_vars() 11389 1726854874.10211: Calling all_inventory to load vars for managed_node3 11389 1726854874.10215: Calling groups_inventory to load vars for managed_node3 11389 1726854874.10217: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854874.10227: Calling all_plugins_play to load vars for managed_node3 11389 1726854874.10230: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854874.10233: Calling groups_plugins_play to load vars for managed_node3 11389 1726854874.12714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854874.14290: done with get_vars() 11389 1726854874.14315: done getting variables 11389 1726854874.14379: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:54:34 -0400 (0:00:00.134) 0:00:26.567 ****** 11389 1726854874.14417: entering _queue_task() for managed_node3/service 11389 1726854874.14763: worker is 1 (out of 1 available) 11389 1726854874.14779: exiting _queue_task() for managed_node3/service 11389 1726854874.14992: done queuing things up, now waiting for results queue to drain 11389 1726854874.14994: waiting for pending results... 
11389 1726854874.15076: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11389 1726854874.15239: in run() - task 0affcc66-ac2b-deb8-c119-00000000008a 11389 1726854874.15256: variable 'ansible_search_path' from source: unknown 11389 1726854874.15263: variable 'ansible_search_path' from source: unknown 11389 1726854874.15305: calling self._execute() 11389 1726854874.15408: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.15418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854874.15435: variable 'omit' from source: magic vars 11389 1726854874.15806: variable 'ansible_distribution_major_version' from source: facts 11389 1726854874.15823: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854874.15945: variable 'network_provider' from source: set_fact 11389 1726854874.15957: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854874.15964: when evaluation is False, skipping this task 11389 1726854874.15978: _execute() done 11389 1726854874.16085: dumping result to json 11389 1726854874.16090: done dumping result, returning 11389 1726854874.16093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-deb8-c119-00000000008a] 11389 1726854874.16095: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008a 11389 1726854874.16164: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008a 11389 1726854874.16170: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11389 1726854874.16232: no more pending results, returning what we have 11389 1726854874.16236: results queue empty 11389 1726854874.16237: checking for any_errors_fatal 11389 1726854874.16246: done checking for 
any_errors_fatal 11389 1726854874.16247: checking for max_fail_percentage 11389 1726854874.16249: done checking for max_fail_percentage 11389 1726854874.16251: checking to see if all hosts have failed and the running result is not ok 11389 1726854874.16252: done checking to see if all hosts have failed 11389 1726854874.16252: getting the remaining hosts for this loop 11389 1726854874.16255: done getting the remaining hosts for this loop 11389 1726854874.16258: getting the next task for host managed_node3 11389 1726854874.16268: done getting next task for host managed_node3 11389 1726854874.16272: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11389 1726854874.16276: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854874.16297: getting variables 11389 1726854874.16299: in VariableManager get_vars() 11389 1726854874.16339: Calling all_inventory to load vars for managed_node3 11389 1726854874.16341: Calling groups_inventory to load vars for managed_node3 11389 1726854874.16344: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854874.16356: Calling all_plugins_play to load vars for managed_node3 11389 1726854874.16359: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854874.16362: Calling groups_plugins_play to load vars for managed_node3 11389 1726854874.17890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854874.19583: done with get_vars() 11389 1726854874.19608: done getting variables 11389 1726854874.19678: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:54:34 -0400 (0:00:00.052) 0:00:26.620 ****** 11389 1726854874.19717: entering _queue_task() for managed_node3/copy 11389 1726854874.20062: worker is 1 (out of 1 available) 11389 1726854874.20077: exiting _queue_task() for managed_node3/copy 11389 1726854874.20091: done queuing things up, now waiting for results queue to drain 11389 1726854874.20092: waiting for pending results... 
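The first skipped task above prints a censored result because it was run with `no_log: true`: the entire payload is replaced by a fixed notice before it reaches the callback. A simplified sketch of that scrubbing step (the notice wording is copied from the log; the real logic lives in ansible-core's result-cleaning helpers, and this function is an illustrative stand-in):

```python
# Sketch of no_log censoring: when a task sets no_log: true, every key of the
# result except 'changed' is dropped and replaced with a fixed 'censored' notice,
# exactly as seen in the skipping: [managed_node3] output above.
CENSORED = ("the output has been hidden due to the fact that 'no_log: true' "
            "was specified for this result")

def censor(result: dict, no_log: bool) -> dict:
    """Return the result as a callback would see it."""
    if not no_log:
        return result
    # Only 'changed' survives; everything else (including any secrets) is hidden.
    return {"censored": CENSORED, "changed": result.get("changed", False)}

shown = censor({"changed": False, "secret": "hunter2"}, no_log=True)
```

With `no_log` unset the result passes through untouched, which is why the second skip in this log still shows its `false_condition` and `skip_reason` fields.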
11389 1726854874.20384: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11389 1726854874.20606: in run() - task 0affcc66-ac2b-deb8-c119-00000000008b 11389 1726854874.20609: variable 'ansible_search_path' from source: unknown 11389 1726854874.20613: variable 'ansible_search_path' from source: unknown 11389 1726854874.20616: calling self._execute() 11389 1726854874.20720: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.20732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854874.20746: variable 'omit' from source: magic vars 11389 1726854874.21130: variable 'ansible_distribution_major_version' from source: facts 11389 1726854874.21152: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854874.21370: variable 'network_provider' from source: set_fact 11389 1726854874.21374: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854874.21376: when evaluation is False, skipping this task 11389 1726854874.21378: _execute() done 11389 1726854874.21380: dumping result to json 11389 1726854874.21382: done dumping result, returning 11389 1726854874.21385: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-deb8-c119-00000000008b] 11389 1726854874.21388: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008b 11389 1726854874.21462: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008b 11389 1726854874.21468: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11389 1726854874.21641: no more pending results, returning what we have 11389 1726854874.21645: results queue empty 11389 1726854874.21645: checking for 
any_errors_fatal 11389 1726854874.21652: done checking for any_errors_fatal 11389 1726854874.21653: checking for max_fail_percentage 11389 1726854874.21656: done checking for max_fail_percentage 11389 1726854874.21657: checking to see if all hosts have failed and the running result is not ok 11389 1726854874.21658: done checking to see if all hosts have failed 11389 1726854874.21659: getting the remaining hosts for this loop 11389 1726854874.21661: done getting the remaining hosts for this loop 11389 1726854874.21664: getting the next task for host managed_node3 11389 1726854874.21675: done getting next task for host managed_node3 11389 1726854874.21679: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11389 1726854874.21684: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854874.21706: getting variables 11389 1726854874.21708: in VariableManager get_vars() 11389 1726854874.21743: Calling all_inventory to load vars for managed_node3 11389 1726854874.21745: Calling groups_inventory to load vars for managed_node3 11389 1726854874.21748: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854874.21758: Calling all_plugins_play to load vars for managed_node3 11389 1726854874.21761: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854874.21764: Calling groups_plugins_play to load vars for managed_node3 11389 1726854874.23198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854874.24810: done with get_vars() 11389 1726854874.24838: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:54:34 -0400 (0:00:00.052) 0:00:26.672 ****** 11389 1726854874.24939: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11389 1726854874.25301: worker is 1 (out of 1 available) 11389 1726854874.25313: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11389 1726854874.25324: done queuing things up, now waiting for results queue to drain 11389 1726854874.25325: waiting for pending results... 
11389 1726854874.25614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11389 1726854874.25790: in run() - task 0affcc66-ac2b-deb8-c119-00000000008c 11389 1726854874.25816: variable 'ansible_search_path' from source: unknown 11389 1726854874.25824: variable 'ansible_search_path' from source: unknown 11389 1726854874.25875: calling self._execute() 11389 1726854874.25984: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.25997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854874.26058: variable 'omit' from source: magic vars 11389 1726854874.26425: variable 'ansible_distribution_major_version' from source: facts 11389 1726854874.26444: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854874.26457: variable 'omit' from source: magic vars 11389 1726854874.26537: variable 'omit' from source: magic vars 11389 1726854874.26716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11389 1726854874.28903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11389 1726854874.29003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11389 1726854874.29025: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11389 1726854874.29069: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11389 1726854874.29288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11389 1726854874.29294: variable 'network_provider' from source: set_fact 11389 1726854874.29340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11389 1726854874.29394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11389 1726854874.29427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11389 1726854874.29477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11389 1726854874.29499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11389 1726854874.29584: variable 'omit' from source: magic vars 11389 1726854874.29715: variable 'omit' from source: magic vars 11389 1726854874.29834: variable 'network_connections' from source: task vars 11389 1726854874.29856: variable 'port2_profile' from source: play vars 11389 1726854874.29927: variable 'port2_profile' from source: play vars 11389 1726854874.29951: variable 'port1_profile' from source: play vars 11389 1726854874.30014: variable 'port1_profile' from source: play vars 11389 1726854874.30061: variable 'controller_profile' from source: play vars 11389 1726854874.30102: variable 'controller_profile' from source: play vars 11389 1726854874.30562: variable 'omit' from source: magic vars 11389 1726854874.30581: variable '__lsr_ansible_managed' from source: task vars 11389 1726854874.30646: variable '__lsr_ansible_managed' from source: task vars 11389 1726854874.30832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11389 
1726854874.31380: Loaded config def from plugin (lookup/template) 11389 1726854874.31393: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11389 1726854874.31428: File lookup term: get_ansible_managed.j2 11389 1726854874.31435: variable 'ansible_search_path' from source: unknown 11389 1726854874.31445: evaluation_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11389 1726854874.31513: search_path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11389 1726854874.31516: variable 'ansible_search_path' from source: unknown 11389 1726854874.37549: variable 'ansible_managed' from source: unknown 11389 1726854874.37695: variable 'omit' from source: magic vars 11389 1726854874.37728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854874.37762: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854874.37786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854874.37861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854874.37864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854874.37869: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854874.37871: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.37873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854874.37961: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854874.37981: Set connection var ansible_timeout to 10 11389 1726854874.37989: Set connection var ansible_connection to ssh 11389 1726854874.37999: Set connection var ansible_shell_type to sh 11389 1726854874.38007: Set connection var ansible_pipelining to False 11389 1726854874.38015: Set connection var ansible_shell_executable to /bin/sh 11389 1726854874.38039: variable 'ansible_shell_executable' from source: unknown 11389 1726854874.38079: variable 'ansible_connection' from source: unknown 11389 1726854874.38082: variable 'ansible_module_compression' from source: unknown 11389 1726854874.38084: variable 'ansible_shell_type' from source: unknown 11389 1726854874.38086: variable 'ansible_shell_executable' from source: unknown 11389 1726854874.38089: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854874.38091: variable 'ansible_pipelining' from source: unknown 11389 1726854874.38093: variable 'ansible_timeout' from source: unknown 11389 1726854874.38103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 
1726854874.38385: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854874.38390: variable 'omit' from source: magic vars 11389 1726854874.38393: starting attempt loop 11389 1726854874.38395: running the handler 11389 1726854874.38397: _low_level_execute_command(): starting 11389 1726854874.38399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854874.39707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854874.39938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854874.39999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854874.41690: stdout chunk (state=3): >>>/root <<< 11389 1726854874.41843: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11389 1726854874.41846: stdout chunk (state=3): >>><<< 11389 1726854874.41849: stderr chunk (state=3): >>><<< 11389 1726854874.41968: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854874.41972: _low_level_execute_command(): starting 11389 1726854874.41976: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738 `" && echo ansible-tmp-1726854874.4187453-12683-208054021024738="` echo /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738 `" ) && sleep 0' 11389 1726854874.43094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854874.43099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854874.43114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854874.43118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854874.43302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854874.43419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854874.45386: stdout chunk (state=3): >>>ansible-tmp-1726854874.4187453-12683-208054021024738=/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738 <<< 11389 1726854874.45510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854874.45561: stderr chunk (state=3): >>><<< 11389 1726854874.45578: stdout chunk (state=3): >>><<< 11389 1726854874.45604: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854874.4187453-12683-208054021024738=/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854874.45662: variable 'ansible_module_compression' from source: unknown 11389 1726854874.45718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11389 1726854874.45786: variable 'ansible_facts' from source: unknown 11389 1726854874.45945: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py 11389 1726854874.46176: Sending initial data 11389 1726854874.46182: Sent initial data (168 bytes) 11389 1726854874.46742: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854874.46756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854874.46767: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11389 1726854874.46792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854874.46825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854874.46913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854874.46943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854874.47036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854874.48593: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854874.48674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854874.48727: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp83nkn9ch /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py <<< 11389 1726854874.48751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py" <<< 11389 1726854874.48818: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp83nkn9ch" to remote "/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py" <<< 11389 1726854874.49915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854874.49954: stderr chunk (state=3): >>><<< 11389 1726854874.50013: stdout chunk (state=3): >>><<< 11389 1726854874.50021: done transferring module to remote 11389 1726854874.50042: _low_level_execute_command(): starting 11389 1726854874.50052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/ /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py && sleep 0' 11389 1726854874.50741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854874.50754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854874.50774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 
1726854874.50848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854874.50927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854874.50944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854874.50976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854874.51128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854874.52950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854874.52976: stdout chunk (state=3): >>><<< 11389 1726854874.52979: stderr chunk (state=3): >>><<< 11389 1726854874.53073: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854874.53077: _low_level_execute_command(): starting 11389 1726854874.53079: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/AnsiballZ_network_connections.py && sleep 0' 11389 1726854874.53614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854874.53638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854874.53668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854874.53762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854874.53818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854874.53841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854874.53944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854874.54106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.07489: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/95858a33-49bf-46bb-9b62-ad79a5b8ae35: error=unknown <<< 11389 1726854875.09369: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/4a57b26e-7a43-4b52-89b8-337f92ac6d2d: error=unknown<<< 11389 1726854875.09383: stdout chunk (state=3): >>> <<< 11389 1726854875.11016: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7e031d4d-65d1-4bb1-88e6-3f83b0d194c1: error=unknown<<< 11389 1726854875.11169: stdout chunk (state=3): >>> <<< 11389 1726854875.11305: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": 
"bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11389 1726854875.13300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854875.13304: stdout chunk (state=3): >>><<< 11389 1726854875.13306: stderr chunk (state=3): >>><<< 11389 1726854875.13367: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/95858a33-49bf-46bb-9b62-ad79a5b8ae35: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/4a57b26e-7a43-4b52-89b8-337f92ac6d2d: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yo6p8mik/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7e031d4d-65d1-4bb1-88e6-3f83b0d194c1: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
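[editor's note] The tracebacks and JSON result above come from the fedora.linux_system_roles.network_connections module tearing down the bond profiles. As a hedged illustration (connection names and values are copied from the module_args echoed in this log, but the playbook/role variables that produced them are not part of this transcript), role input of roughly this shape would generate that invocation:

```yaml
# Hypothetical reconstruction from the module_args above -- the actual
# playbook for this run is not shown in this log.
network_connections:
  - name: bond0.1
    persistent_state: absent   # remove the profile from persistent storage...
    state: down                # ...and deactivate it first
  - name: bond0.0
    persistent_state: absent
    state: down
  - name: bond0
    persistent_state: absent
    state: down
```

Note that the run still finishes with rc=0 and "changed": true: the LsrNetworkNmError tracebacks ("Connection volatilize aborted ... error=unknown") are emitted on stdout by the NM provider but do not fail the module here.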
11389 1726854875.13403: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854875.13417: _low_level_execute_command(): starting 11389 1726854875.13475: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854874.4187453-12683-208054021024738/ > /dev/null 2>&1 && sleep 0' 11389 1726854875.14094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.14098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854875.14101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.14110: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.14165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854875.14189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.14206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.14321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.16394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.16398: stderr chunk (state=3): >>><<< 11389 1726854875.16400: stdout chunk (state=3): >>><<< 11389 1726854875.16402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854875.16405: handler run complete 11389 1726854875.16407: attempt loop complete, returning result 11389 1726854875.16408: _execute() done 11389 1726854875.16410: dumping result to json 11389 1726854875.16412: done dumping result, returning 11389 1726854875.16414: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-deb8-c119-00000000008c] 11389 1726854875.16416: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008c 11389 1726854875.16494: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008c 11389 1726854875.16498: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11389 1726854875.16831: no more pending results, returning what we have 11389 1726854875.16836: results queue empty 11389 1726854875.16837: checking for any_errors_fatal 11389 1726854875.16844: done checking for any_errors_fatal 11389 1726854875.16844: checking for max_fail_percentage 11389 1726854875.16847: done checking for max_fail_percentage 11389 1726854875.16848: checking to see if all hosts have failed and the running result is not ok 11389 
1726854875.16848: done checking to see if all hosts have failed 11389 1726854875.16849: getting the remaining hosts for this loop 11389 1726854875.16851: done getting the remaining hosts for this loop 11389 1726854875.16854: getting the next task for host managed_node3 11389 1726854875.16861: done getting next task for host managed_node3 11389 1726854875.16868: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11389 1726854875.16872: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854875.16884: getting variables 11389 1726854875.16886: in VariableManager get_vars() 11389 1726854875.16932: Calling all_inventory to load vars for managed_node3 11389 1726854875.16935: Calling groups_inventory to load vars for managed_node3 11389 1726854875.16937: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.16948: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.16955: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.16959: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.18781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.20492: done with get_vars() 11389 1726854875.20517: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:54:35 -0400 (0:00:00.956) 0:00:27.629 ****** 11389 1726854875.20607: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11389 1726854875.21117: worker is 1 (out of 1 available) 11389 1726854875.21127: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11389 1726854875.21136: done queuing things up, now waiting for results queue to drain 11389 1726854875.21138: waiting for pending results... 
11389 1726854875.21258: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11389 1726854875.21424: in run() - task 0affcc66-ac2b-deb8-c119-00000000008d 11389 1726854875.21445: variable 'ansible_search_path' from source: unknown 11389 1726854875.21455: variable 'ansible_search_path' from source: unknown 11389 1726854875.21509: calling self._execute() 11389 1726854875.21618: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.21630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.21695: variable 'omit' from source: magic vars 11389 1726854875.22048: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.22065: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.22202: variable 'network_state' from source: role '' defaults 11389 1726854875.22239: Evaluated conditional (network_state != {}): False 11389 1726854875.22246: when evaluation is False, skipping this task 11389 1726854875.22249: _execute() done 11389 1726854875.22251: dumping result to json 11389 1726854875.22255: done dumping result, returning 11389 1726854875.22293: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-deb8-c119-00000000008d] 11389 1726854875.22297: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008d 11389 1726854875.22503: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008d 11389 1726854875.22507: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11389 1726854875.22681: no more pending results, returning what we have 11389 1726854875.22685: results queue empty 11389 1726854875.22685: checking for any_errors_fatal 11389 1726854875.22697: done checking for any_errors_fatal 
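[editor's note] The "Configure networking state" task above is skipped because its conditional evaluates false (network_state comes from the role's defaults and is empty). The guard follows this pattern (a sketch only, not a verbatim copy of tasks/main.yml:171):

```yaml
# Sketch: a task guarded by "when"; a false expression is echoed back
# as "false_condition" in the skip result, as seen in the log above.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # ...provider/state arguments would go here...
  when: network_state != {}
```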
11389 1726854875.22698: checking for max_fail_percentage 11389 1726854875.22700: done checking for max_fail_percentage 11389 1726854875.22701: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.22702: done checking to see if all hosts have failed 11389 1726854875.22703: getting the remaining hosts for this loop 11389 1726854875.22704: done getting the remaining hosts for this loop 11389 1726854875.22707: getting the next task for host managed_node3 11389 1726854875.22714: done getting next task for host managed_node3 11389 1726854875.22717: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11389 1726854875.22721: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854875.22738: getting variables 11389 1726854875.22740: in VariableManager get_vars() 11389 1726854875.22894: Calling all_inventory to load vars for managed_node3 11389 1726854875.22897: Calling groups_inventory to load vars for managed_node3 11389 1726854875.22900: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.22909: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.22912: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.22915: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.24297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.25953: done with get_vars() 11389 1726854875.25982: done getting variables 11389 1726854875.26044: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:54:35 -0400 (0:00:00.054) 0:00:27.683 ****** 11389 1726854875.26090: entering _queue_task() for managed_node3/debug 11389 1726854875.26421: worker is 1 (out of 1 available) 11389 1726854875.26436: exiting _queue_task() for managed_node3/debug 11389 1726854875.26447: done queuing things up, now waiting for results queue to drain 11389 1726854875.26449: waiting for pending results... 
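[editor's note] The next task simply echoes the stderr captured from the earlier network_connections run. A debug task of this shape (hedged; the real task lives at tasks/main.yml:177 but its exact contents are not in this log) produces an ok: result printing __network_connections_result.stderr_lines:

```yaml
# Hedged sketch: printing a previously set_fact'd result with the
# debug module's "var" option yields the ok: [managed_node3] output
# format seen in this transcript.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```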
11389 1726854875.26629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11389 1726854875.26720: in run() - task 0affcc66-ac2b-deb8-c119-00000000008e 11389 1726854875.26733: variable 'ansible_search_path' from source: unknown 11389 1726854875.26736: variable 'ansible_search_path' from source: unknown 11389 1726854875.26764: calling self._execute() 11389 1726854875.26836: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.26845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.26880: variable 'omit' from source: magic vars 11389 1726854875.27139: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.27149: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.27155: variable 'omit' from source: magic vars 11389 1726854875.27202: variable 'omit' from source: magic vars 11389 1726854875.27229: variable 'omit' from source: magic vars 11389 1726854875.27263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854875.27292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854875.27309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854875.27323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.27333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.27356: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854875.27360: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.27363: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11389 1726854875.27433: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854875.27440: Set connection var ansible_timeout to 10 11389 1726854875.27442: Set connection var ansible_connection to ssh 11389 1726854875.27445: Set connection var ansible_shell_type to sh 11389 1726854875.27451: Set connection var ansible_pipelining to False 11389 1726854875.27456: Set connection var ansible_shell_executable to /bin/sh 11389 1726854875.27476: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.27479: variable 'ansible_connection' from source: unknown 11389 1726854875.27482: variable 'ansible_module_compression' from source: unknown 11389 1726854875.27484: variable 'ansible_shell_type' from source: unknown 11389 1726854875.27488: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.27490: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.27493: variable 'ansible_pipelining' from source: unknown 11389 1726854875.27495: variable 'ansible_timeout' from source: unknown 11389 1726854875.27500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.27596: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854875.27609: variable 'omit' from source: magic vars 11389 1726854875.27612: starting attempt loop 11389 1726854875.27615: running the handler 11389 1726854875.27705: variable '__network_connections_result' from source: set_fact 11389 1726854875.27744: handler run complete 11389 1726854875.27757: attempt loop complete, returning result 11389 1726854875.27762: _execute() done 11389 1726854875.27764: dumping result to json 11389 1726854875.27770: 
done dumping result, returning 11389 1726854875.27776: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-deb8-c119-00000000008e] 11389 1726854875.27781: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008e 11389 1726854875.27902: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008e 11389 1726854875.27905: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 11389 1726854875.27969: no more pending results, returning what we have 11389 1726854875.27972: results queue empty 11389 1726854875.27973: checking for any_errors_fatal 11389 1726854875.27978: done checking for any_errors_fatal 11389 1726854875.27979: checking for max_fail_percentage 11389 1726854875.27981: done checking for max_fail_percentage 11389 1726854875.27982: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.27983: done checking to see if all hosts have failed 11389 1726854875.27984: getting the remaining hosts for this loop 11389 1726854875.27985: done getting the remaining hosts for this loop 11389 1726854875.27990: getting the next task for host managed_node3 11389 1726854875.27996: done getting next task for host managed_node3 11389 1726854875.28000: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11389 1726854875.28004: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854875.28014: getting variables 11389 1726854875.28016: in VariableManager get_vars() 11389 1726854875.28153: Calling all_inventory to load vars for managed_node3 11389 1726854875.28156: Calling groups_inventory to load vars for managed_node3 11389 1726854875.28158: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.28166: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.28168: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.28171: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.29126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.30185: done with get_vars() 11389 1726854875.30205: done getting variables 11389 1726854875.30247: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:54:35 -0400 (0:00:00.041) 0:00:27.725 ****** 11389 1726854875.30275: entering _queue_task() for managed_node3/debug 11389 1726854875.30522: worker is 1 (out of 1 available) 11389 
1726854875.30536: exiting _queue_task() for managed_node3/debug 11389 1726854875.30546: done queuing things up, now waiting for results queue to drain 11389 1726854875.30548: waiting for pending results... 11389 1726854875.30733: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11389 1726854875.30835: in run() - task 0affcc66-ac2b-deb8-c119-00000000008f 11389 1726854875.30846: variable 'ansible_search_path' from source: unknown 11389 1726854875.30850: variable 'ansible_search_path' from source: unknown 11389 1726854875.30882: calling self._execute() 11389 1726854875.30959: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.30963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.30972: variable 'omit' from source: magic vars 11389 1726854875.31245: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.31255: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.31261: variable 'omit' from source: magic vars 11389 1726854875.31307: variable 'omit' from source: magic vars 11389 1726854875.31335: variable 'omit' from source: magic vars 11389 1726854875.31368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854875.31397: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854875.31414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854875.31429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.31438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.31463: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854875.31468: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.31471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.31539: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854875.31544: Set connection var ansible_timeout to 10 11389 1726854875.31547: Set connection var ansible_connection to ssh 11389 1726854875.31552: Set connection var ansible_shell_type to sh 11389 1726854875.31557: Set connection var ansible_pipelining to False 11389 1726854875.31562: Set connection var ansible_shell_executable to /bin/sh 11389 1726854875.31579: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.31581: variable 'ansible_connection' from source: unknown 11389 1726854875.31584: variable 'ansible_module_compression' from source: unknown 11389 1726854875.31587: variable 'ansible_shell_type' from source: unknown 11389 1726854875.31590: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.31593: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.31597: variable 'ansible_pipelining' from source: unknown 11389 1726854875.31599: variable 'ansible_timeout' from source: unknown 11389 1726854875.31603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.31703: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854875.31711: variable 'omit' from source: magic vars 11389 1726854875.31717: starting attempt loop 11389 1726854875.31720: running the handler 11389 1726854875.31758: variable '__network_connections_result' from source: set_fact 11389 
1726854875.31813: variable '__network_connections_result' from source: set_fact 11389 1726854875.31900: handler run complete 11389 1726854875.31917: attempt loop complete, returning result 11389 1726854875.31920: _execute() done 11389 1726854875.31922: dumping result to json 11389 1726854875.31927: done dumping result, returning 11389 1726854875.31935: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-deb8-c119-00000000008f] 11389 1726854875.31940: sending task result for task 0affcc66-ac2b-deb8-c119-00000000008f 11389 1726854875.32034: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000008f 11389 1726854875.32036: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11389 1726854875.32123: no more pending results, returning what we have 11389 1726854875.32126: results queue empty 11389 1726854875.32127: checking for any_errors_fatal 11389 1726854875.32132: done checking for any_errors_fatal 11389 1726854875.32133: checking for max_fail_percentage 11389 1726854875.32135: done checking for max_fail_percentage 11389 1726854875.32136: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.32137: done checking to see if all hosts have failed 11389 1726854875.32137: getting the remaining hosts for this loop 11389 1726854875.32138: done getting the remaining hosts for this loop 11389 1726854875.32141: 
getting the next task for host managed_node3 11389 1726854875.32149: done getting next task for host managed_node3 11389 1726854875.32153: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11389 1726854875.32156: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854875.32169: getting variables 11389 1726854875.32170: in VariableManager get_vars() 11389 1726854875.32202: Calling all_inventory to load vars for managed_node3 11389 1726854875.32205: Calling groups_inventory to load vars for managed_node3 11389 1726854875.32206: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.32214: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.32222: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.32225: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.32978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.33837: done with get_vars() 11389 1726854875.33854: done getting variables 11389 1726854875.33909: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:54:35 -0400 (0:00:00.036) 0:00:27.762 ****** 11389 1726854875.33934: entering _queue_task() for managed_node3/debug 11389 1726854875.34177: worker is 1 (out of 1 available) 11389 1726854875.34192: exiting _queue_task() for managed_node3/debug 11389 1726854875.34204: done queuing things up, now waiting for results queue to drain 11389 1726854875.34206: waiting for pending results... 
11389 1726854875.34409: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11389 1726854875.34505: in run() - task 0affcc66-ac2b-deb8-c119-000000000090 11389 1726854875.34516: variable 'ansible_search_path' from source: unknown 11389 1726854875.34519: variable 'ansible_search_path' from source: unknown 11389 1726854875.34547: calling self._execute() 11389 1726854875.34628: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.34633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.34644: variable 'omit' from source: magic vars 11389 1726854875.34901: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.34911: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.34991: variable 'network_state' from source: role '' defaults 11389 1726854875.35001: Evaluated conditional (network_state != {}): False 11389 1726854875.35005: when evaluation is False, skipping this task 11389 1726854875.35008: _execute() done 11389 1726854875.35010: dumping result to json 11389 1726854875.35013: done dumping result, returning 11389 1726854875.35020: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-deb8-c119-000000000090] 11389 1726854875.35026: sending task result for task 0affcc66-ac2b-deb8-c119-000000000090 11389 1726854875.35112: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000090 11389 1726854875.35115: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11389 1726854875.35158: no more pending results, returning what we have 11389 1726854875.35161: results queue empty 11389 1726854875.35162: checking for any_errors_fatal 11389 1726854875.35175: done checking for any_errors_fatal 11389 1726854875.35175: checking for 
max_fail_percentage 11389 1726854875.35178: done checking for max_fail_percentage 11389 1726854875.35179: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.35180: done checking to see if all hosts have failed 11389 1726854875.35180: getting the remaining hosts for this loop 11389 1726854875.35181: done getting the remaining hosts for this loop 11389 1726854875.35184: getting the next task for host managed_node3 11389 1726854875.35192: done getting next task for host managed_node3 11389 1726854875.35196: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11389 1726854875.35200: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854875.35217: getting variables 11389 1726854875.35218: in VariableManager get_vars() 11389 1726854875.35249: Calling all_inventory to load vars for managed_node3 11389 1726854875.35252: Calling groups_inventory to load vars for managed_node3 11389 1726854875.35254: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.35262: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.35264: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.35269: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.40858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.42445: done with get_vars() 11389 1726854875.42473: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:54:35 -0400 (0:00:00.086) 0:00:27.848 ****** 11389 1726854875.42555: entering _queue_task() for managed_node3/ping 11389 1726854875.42907: worker is 1 (out of 1 available) 11389 1726854875.42921: exiting _queue_task() for managed_node3/ping 11389 1726854875.42931: done queuing things up, now waiting for results queue to drain 11389 1726854875.42933: waiting for pending results... 
11389 1726854875.43307: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11389 1726854875.43317: in run() - task 0affcc66-ac2b-deb8-c119-000000000091 11389 1726854875.43330: variable 'ansible_search_path' from source: unknown 11389 1726854875.43338: variable 'ansible_search_path' from source: unknown 11389 1726854875.43381: calling self._execute() 11389 1726854875.43483: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.43497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.43511: variable 'omit' from source: magic vars 11389 1726854875.43869: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.43885: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.43897: variable 'omit' from source: magic vars 11389 1726854875.43963: variable 'omit' from source: magic vars 11389 1726854875.44058: variable 'omit' from source: magic vars 11389 1726854875.44114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854875.44159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854875.44199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854875.44222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.44276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.44302: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854875.44312: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.44320: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11389 1726854875.44471: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854875.44514: Set connection var ansible_timeout to 10 11389 1726854875.44527: Set connection var ansible_connection to ssh 11389 1726854875.44636: Set connection var ansible_shell_type to sh 11389 1726854875.44640: Set connection var ansible_pipelining to False 11389 1726854875.44643: Set connection var ansible_shell_executable to /bin/sh 11389 1726854875.44645: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.44647: variable 'ansible_connection' from source: unknown 11389 1726854875.44650: variable 'ansible_module_compression' from source: unknown 11389 1726854875.44652: variable 'ansible_shell_type' from source: unknown 11389 1726854875.44655: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.44658: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.44660: variable 'ansible_pipelining' from source: unknown 11389 1726854875.44662: variable 'ansible_timeout' from source: unknown 11389 1726854875.44664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.44851: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 11389 1726854875.44874: variable 'omit' from source: magic vars 11389 1726854875.44897: starting attempt loop 11389 1726854875.44975: running the handler 11389 1726854875.44978: _low_level_execute_command(): starting 11389 1726854875.44981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854875.46136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.46144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854875.46147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.46150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.46220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.47910: stdout chunk (state=3): >>>/root <<< 11389 1726854875.48010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.48091: stderr chunk (state=3): >>><<< 11389 1726854875.48094: stdout chunk (state=3): >>><<< 11389 1726854875.48216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854875.48219: _low_level_execute_command(): starting 11389 1726854875.48223: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405 `" && echo ansible-tmp-1726854875.4811761-12757-84248521122405="` echo /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405 `" ) && sleep 0' 11389 1726854875.48778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854875.48831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.48871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854875.48970: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.49014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.49128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.51042: stdout chunk (state=3): >>>ansible-tmp-1726854875.4811761-12757-84248521122405=/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405 <<< 11389 1726854875.51140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.51190: stderr chunk (state=3): >>><<< 11389 1726854875.51194: stdout chunk (state=3): >>><<< 11389 1726854875.51393: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854875.4811761-12757-84248521122405=/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 
10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854875.51398: variable 'ansible_module_compression' from source: unknown 11389 1726854875.51401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11389 1726854875.51403: variable 'ansible_facts' from source: unknown 11389 1726854875.51443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py 11389 1726854875.51680: Sending initial data 11389 1726854875.51693: Sent initial data (152 bytes) 11389 1726854875.52224: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854875.52239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.52257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854875.52298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854875.52398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.52422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.52500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.54080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854875.54133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854875.54201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpono3kfrq /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py <<< 11389 1726854875.54211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py" <<< 11389 1726854875.54259: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpono3kfrq" to remote "/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py" <<< 11389 1726854875.55095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.55108: stdout chunk (state=3): >>><<< 11389 1726854875.55119: stderr chunk (state=3): >>><<< 11389 1726854875.55147: done transferring module to remote 11389 1726854875.55162: _low_level_execute_command(): starting 11389 1726854875.55171: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/ /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py && sleep 0' 11389 1726854875.55827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854875.55847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.55955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854875.55975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.55991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.56149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.57947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.57964: stdout chunk (state=3): >>><<< 11389 1726854875.57978: stderr chunk (state=3): >>><<< 11389 1726854875.58003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854875.58090: _low_level_execute_command(): starting 11389 1726854875.58097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/AnsiballZ_ping.py && sleep 0' 11389 1726854875.58663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854875.58679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.58694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854875.58710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854875.58724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854875.58758: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.58824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854875.58841: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854875.58865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.58967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.73695: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11389 1726854875.75495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854875.75500: stdout chunk (state=3): >>><<< 11389 1726854875.75503: stderr chunk (state=3): >>><<< 11389 1726854875.75506: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854875.75509: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854875.75511: _low_level_execute_command(): starting 11389 1726854875.75513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854875.4811761-12757-84248521122405/ > /dev/null 2>&1 && sleep 0' 11389 1726854875.76422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854875.76436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.76448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854875.76654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854875.76671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854875.76740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854875.78795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854875.78799: stdout chunk (state=3): >>><<< 11389 1726854875.78801: stderr chunk (state=3): >>><<< 11389 1726854875.78804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854875.78811: handler run complete 11389 1726854875.78814: attempt loop complete, returning result 
11389 1726854875.78824: _execute() done 11389 1726854875.78827: dumping result to json 11389 1726854875.78829: done dumping result, returning 11389 1726854875.78831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-deb8-c119-000000000091] 11389 1726854875.78834: sending task result for task 0affcc66-ac2b-deb8-c119-000000000091 11389 1726854875.78903: done sending task result for task 0affcc66-ac2b-deb8-c119-000000000091 11389 1726854875.78907: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11389 1726854875.78999: no more pending results, returning what we have 11389 1726854875.79003: results queue empty 11389 1726854875.79004: checking for any_errors_fatal 11389 1726854875.79012: done checking for any_errors_fatal 11389 1726854875.79013: checking for max_fail_percentage 11389 1726854875.79016: done checking for max_fail_percentage 11389 1726854875.79017: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.79018: done checking to see if all hosts have failed 11389 1726854875.79018: getting the remaining hosts for this loop 11389 1726854875.79020: done getting the remaining hosts for this loop 11389 1726854875.79023: getting the next task for host managed_node3 11389 1726854875.79039: done getting next task for host managed_node3 11389 1726854875.79042: ^ task is: TASK: meta (role_complete) 11389 1726854875.79046: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854875.79061: getting variables 11389 1726854875.79063: in VariableManager get_vars() 11389 1726854875.79174: Calling all_inventory to load vars for managed_node3 11389 1726854875.79177: Calling groups_inventory to load vars for managed_node3 11389 1726854875.79180: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.79192: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.79195: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.79198: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.82661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.87103: done with get_vars() 11389 1726854875.87128: done getting variables 11389 1726854875.87424: done queuing things up, now waiting for results queue to drain 11389 1726854875.87426: results queue empty 11389 1726854875.87427: checking for any_errors_fatal 11389 1726854875.87430: done checking for any_errors_fatal 11389 1726854875.87430: checking for max_fail_percentage 11389 1726854875.87431: done checking for max_fail_percentage 11389 1726854875.87432: checking to see if all hosts have failed and the running result is not ok 11389 1726854875.87433: done checking to see if all hosts have failed 11389 1726854875.87434: getting the remaining hosts for this loop 11389 1726854875.87434: done getting the remaining hosts for this loop 11389 1726854875.87437: getting the next task for host managed_node3 11389 
1726854875.87441: done getting next task for host managed_node3 11389 1726854875.87443: ^ task is: TASK: Delete the device '{{ controller_device }}' 11389 1726854875.87445: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854875.87447: getting variables 11389 1726854875.87448: in VariableManager get_vars() 11389 1726854875.87464: Calling all_inventory to load vars for managed_node3 11389 1726854875.87468: Calling groups_inventory to load vars for managed_node3 11389 1726854875.87470: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854875.87475: Calling all_plugins_play to load vars for managed_node3 11389 1726854875.87477: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854875.87480: Calling groups_plugins_play to load vars for managed_node3 11389 1726854875.89862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854875.94209: done with get_vars() 11389 1726854875.94237: done getting variables 11389 1726854875.94286: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 11389 1726854875.94819: variable 'controller_device' from source: play vars TASK [Delete the device 
'nm-bond'] ********************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 13:54:35 -0400 (0:00:00.522) 0:00:28.371 ****** 11389 1726854875.94850: entering _queue_task() for managed_node3/command 11389 1726854875.96027: worker is 1 (out of 1 available) 11389 1726854875.96037: exiting _queue_task() for managed_node3/command 11389 1726854875.96048: done queuing things up, now waiting for results queue to drain 11389 1726854875.96049: waiting for pending results... 11389 1726854875.96403: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 11389 1726854875.96409: in run() - task 0affcc66-ac2b-deb8-c119-0000000000c1 11389 1726854875.96426: variable 'ansible_search_path' from source: unknown 11389 1726854875.96465: calling self._execute() 11389 1726854875.96893: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.96897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.96900: variable 'omit' from source: magic vars 11389 1726854875.97383: variable 'ansible_distribution_major_version' from source: facts 11389 1726854875.97696: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854875.97700: variable 'omit' from source: magic vars 11389 1726854875.97702: variable 'omit' from source: magic vars 11389 1726854875.97739: variable 'controller_device' from source: play vars 11389 1726854875.97765: variable 'omit' from source: magic vars 11389 1726854875.98092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854875.98096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854875.98098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854875.98117: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.98134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854875.98169: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854875.98179: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.98190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.98693: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854875.98697: Set connection var ansible_timeout to 10 11389 1726854875.98699: Set connection var ansible_connection to ssh 11389 1726854875.98701: Set connection var ansible_shell_type to sh 11389 1726854875.98703: Set connection var ansible_pipelining to False 11389 1726854875.98705: Set connection var ansible_shell_executable to /bin/sh 11389 1726854875.98707: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.98709: variable 'ansible_connection' from source: unknown 11389 1726854875.98712: variable 'ansible_module_compression' from source: unknown 11389 1726854875.98713: variable 'ansible_shell_type' from source: unknown 11389 1726854875.98715: variable 'ansible_shell_executable' from source: unknown 11389 1726854875.98717: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854875.98719: variable 'ansible_pipelining' from source: unknown 11389 1726854875.98721: variable 'ansible_timeout' from source: unknown 11389 1726854875.98723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854875.98801: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854875.99166: variable 'omit' from source: magic vars 11389 1726854875.99169: starting attempt loop 11389 1726854875.99171: running the handler 11389 1726854875.99174: _low_level_execute_command(): starting 11389 1726854875.99175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854876.00421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.00447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854876.00464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.00560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.02248: stdout chunk (state=3): >>>/root <<< 11389 1726854876.02378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 
1726854876.02395: stdout chunk (state=3): >>><<< 11389 1726854876.02409: stderr chunk (state=3): >>><<< 11389 1726854876.02437: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.02458: _low_level_execute_command(): starting 11389 1726854876.02471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290 `" && echo ansible-tmp-1726854876.0244498-12797-34869013605290="` echo /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290 `" ) && sleep 0' 11389 1726854876.03528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854876.03795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.04026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.04075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.06041: stdout chunk (state=3): >>>ansible-tmp-1726854876.0244498-12797-34869013605290=/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290 <<< 11389 1726854876.06366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.06379: stdout chunk (state=3): >>><<< 11389 1726854876.06395: stderr chunk (state=3): >>><<< 11389 1726854876.06417: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854876.0244498-12797-34869013605290=/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.06453: variable 'ansible_module_compression' from source: unknown 11389 1726854876.06512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854876.06731: variable 'ansible_facts' from source: unknown 11389 1726854876.06824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py 11389 1726854876.07239: Sending initial data 11389 1726854876.07242: Sent initial data (155 bytes) 11389 1726854876.08460: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.08736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.08782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.10716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854876.10757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854876.10820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmphu65osnf /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py <<< 11389 1726854876.10831: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py" <<< 11389 1726854876.10882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmphu65osnf" to remote "/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py" <<< 11389 1726854876.12395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.12399: stdout chunk (state=3): >>><<< 11389 1726854876.12403: stderr chunk (state=3): >>><<< 11389 1726854876.12405: done transferring module to remote 11389 1726854876.12407: _low_level_execute_command(): starting 11389 1726854876.12409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/ /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py && sleep 0' 11389 1726854876.13858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.13921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854876.14196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.14214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854876.14251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.14307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.16142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.16157: stdout chunk (state=3): >>><<< 11389 1726854876.16174: stderr chunk (state=3): >>><<< 11389 1726854876.16490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.16498: _low_level_execute_command(): starting 11389 1726854876.16501: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/AnsiballZ_command.py && sleep 0' 11389 1726854876.17508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854876.17524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.17539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.17704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854876.17965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.18026: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.34098: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:54:36.332849", "end": "2024-09-20 13:54:36.339961", "delta": "0:00:00.007112", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854876.35595: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. <<< 11389 1726854876.35606: stdout chunk (state=3): >>><<< 11389 1726854876.35617: stderr chunk (state=3): >>><<< 11389 1726854876.35638: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:54:36.332849", "end": "2024-09-20 13:54:36.339961", "delta": "0:00:00.007112", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.244 closed. 11389 1726854876.36026: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854876.36030: _low_level_execute_command(): starting 11389 1726854876.36033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854876.0244498-12797-34869013605290/ > /dev/null 2>&1 && sleep 0' 11389 1726854876.36984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.37104: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.37191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.37204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.37353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.39385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.39391: stdout chunk (state=3): >>><<< 11389 1726854876.39397: stderr chunk (state=3): >>><<< 11389 1726854876.39462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.39472: handler run complete 11389 1726854876.39540: Evaluated conditional (False): False 11389 1726854876.39550: Evaluated conditional (False): False 11389 1726854876.39637: attempt loop complete, returning result 11389 1726854876.39641: _execute() done 11389 1726854876.39644: dumping result to json 11389 1726854876.39649: done dumping result, returning 11389 1726854876.39659: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0affcc66-ac2b-deb8-c119-0000000000c1] 11389 1726854876.39672: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c1 ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007112", "end": "2024-09-20 13:54:36.339961", "failed_when_result": false, "rc": 1, "start": "2024-09-20 13:54:36.332849" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11389 1726854876.40038: no more pending results, returning what we have 11389 1726854876.40041: results queue empty 11389 1726854876.40042: checking for any_errors_fatal 11389 1726854876.40045: done checking for any_errors_fatal 11389 1726854876.40045: checking for max_fail_percentage 11389 1726854876.40048: done checking for max_fail_percentage 11389 1726854876.40049: checking to see if all hosts have failed and the 
running result is not ok 11389 1726854876.40050: done checking to see if all hosts have failed 11389 1726854876.40051: getting the remaining hosts for this loop 11389 1726854876.40052: done getting the remaining hosts for this loop 11389 1726854876.40056: getting the next task for host managed_node3 11389 1726854876.40065: done getting next task for host managed_node3 11389 1726854876.40071: ^ task is: TASK: Remove test interfaces 11389 1726854876.40074: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854876.40080: getting variables 11389 1726854876.40082: in VariableManager get_vars() 11389 1726854876.40126: Calling all_inventory to load vars for managed_node3 11389 1726854876.40129: Calling groups_inventory to load vars for managed_node3 11389 1726854876.40131: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854876.40144: Calling all_plugins_play to load vars for managed_node3 11389 1726854876.40147: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854876.40151: Calling groups_plugins_play to load vars for managed_node3 11389 1726854876.40893: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c1 11389 1726854876.40896: WORKER PROCESS EXITING 11389 1726854876.43756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854876.47173: done with get_vars() 11389 1726854876.47315: done getting variables 11389 1726854876.47382: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:54:36 -0400 (0:00:00.526) 0:00:28.898 ****** 11389 1726854876.47551: entering _queue_task() for managed_node3/shell 11389 1726854876.48271: worker is 1 (out of 1 available) 11389 1726854876.48282: exiting _queue_task() for managed_node3/shell 11389 1726854876.48390: done queuing things up, now waiting for results queue to drain 11389 1726854876.48393: waiting for pending results... 
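The "Delete the device 'nm-bond'" task above returned rc=1 with `Cannot find device "nm-bond"` on stderr, yet the result is `ok` because the task's `failed_when` condition evaluated to False (the log shows `failed_when_result: false`). The same "missing device is not a failure" logic can be sketched in plain shell; this is an illustrative stand-in, not the role's actual implementation, and the inner `sh -c` command is a hypothetical substitute for a real `ip link del` on a host where the device is already gone:

```shell
#!/bin/sh
# Sketch: treat rc!=0 as success when stderr says the device is already
# absent; propagate any other failure. Mirrors the failed_when override
# seen in the log, outside of Ansible.
run_del() {
    err=$("$@" 2>&1)
    rc=$?
    if [ "$rc" -ne 0 ] && ! printf '%s' "$err" | grep -q 'Cannot find device'; then
        echo "real failure (rc=$rc): $err" >&2
        return "$rc"
    fi
    return 0
}

# Hypothetical stand-in for `ip link del nm-bond` when the device is missing:
run_del sh -c 'echo "Cannot find device \"nm-bond\"" >&2; exit 1' && echo "treated as ok"
```

This keeps cleanup tasks idempotent: a device that was never created (or was already deleted) does not fail the play.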
11389 1726854876.49004: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 11389 1726854876.49397: in run() - task 0affcc66-ac2b-deb8-c119-0000000000c5 11389 1726854876.49402: variable 'ansible_search_path' from source: unknown 11389 1726854876.49405: variable 'ansible_search_path' from source: unknown 11389 1726854876.49408: calling self._execute() 11389 1726854876.49411: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854876.49414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854876.49416: variable 'omit' from source: magic vars 11389 1726854876.50395: variable 'ansible_distribution_major_version' from source: facts 11389 1726854876.50415: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854876.50428: variable 'omit' from source: magic vars 11389 1726854876.50495: variable 'omit' from source: magic vars 11389 1726854876.50864: variable 'dhcp_interface1' from source: play vars 11389 1726854876.50876: variable 'dhcp_interface2' from source: play vars 11389 1726854876.50903: variable 'omit' from source: magic vars 11389 1726854876.50950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854876.50992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854876.51021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854876.51392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854876.51396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854876.51399: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854876.51401: variable 'ansible_host' from source: host 
vars for 'managed_node3' 11389 1726854876.51403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854876.51405: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854876.51407: Set connection var ansible_timeout to 10 11389 1726854876.51409: Set connection var ansible_connection to ssh 11389 1726854876.51412: Set connection var ansible_shell_type to sh 11389 1726854876.51414: Set connection var ansible_pipelining to False 11389 1726854876.51416: Set connection var ansible_shell_executable to /bin/sh 11389 1726854876.51515: variable 'ansible_shell_executable' from source: unknown 11389 1726854876.51519: variable 'ansible_connection' from source: unknown 11389 1726854876.51522: variable 'ansible_module_compression' from source: unknown 11389 1726854876.51525: variable 'ansible_shell_type' from source: unknown 11389 1726854876.51527: variable 'ansible_shell_executable' from source: unknown 11389 1726854876.51529: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854876.51531: variable 'ansible_pipelining' from source: unknown 11389 1726854876.51536: variable 'ansible_timeout' from source: unknown 11389 1726854876.51541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854876.51907: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854876.51918: variable 'omit' from source: magic vars 11389 1726854876.51923: starting attempt loop 11389 1726854876.51926: running the handler 11389 1726854876.51937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854876.51956: _low_level_execute_command(): starting 11389 1726854876.51964: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854876.53436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854876.53456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.53758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.53807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.55502: stdout chunk (state=3): >>>/root <<< 11389 1726854876.55616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.55652: stderr chunk (state=3): >>><<< 11389 1726854876.55662: stdout chunk (state=3): >>><<< 11389 1726854876.55699: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.55810: _low_level_execute_command(): starting 11389 1726854876.55824: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774 `" && echo ansible-tmp-1726854876.5579531-12815-123091445502774="` echo /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774 `" ) && sleep 0' 11389 1726854876.56930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.56952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.56976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.57208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.57221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.57298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.59202: stdout chunk (state=3): >>>ansible-tmp-1726854876.5579531-12815-123091445502774=/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774 <<< 11389 1726854876.59434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.59581: stderr chunk (state=3): >>><<< 11389 1726854876.59584: stdout chunk (state=3): >>><<< 11389 1726854876.59605: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854876.5579531-12815-123091445502774=/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 
originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.59646: variable 'ansible_module_compression' from source: unknown 11389 1726854876.59718: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854876.59817: variable 'ansible_facts' from source: unknown 11389 1726854876.60094: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py 11389 1726854876.60330: Sending initial data 11389 1726854876.60340: Sent initial data (156 bytes) 11389 1726854876.61545: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.61560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.61603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.61825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854876.61904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.61917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.63481: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854876.63538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854876.63599: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmptonct94_ /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py <<< 11389 1726854876.63603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py" <<< 11389 1726854876.63673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmptonct94_" to remote "/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py" <<< 11389 1726854876.65216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.65220: stdout chunk (state=3): >>><<< 11389 1726854876.65223: stderr chunk (state=3): >>><<< 11389 1726854876.65250: done transferring module to remote 11389 1726854876.65259: _low_level_execute_command(): starting 11389 1726854876.65266: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/ /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py && sleep 0' 11389 1726854876.67098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.67103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.67105: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.67108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.67659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.69544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.69606: stderr chunk (state=3): >>><<< 11389 1726854876.69610: stdout chunk (state=3): >>><<< 11389 1726854876.69630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.69634: _low_level_execute_command(): starting 11389 1726854876.69638: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/AnsiballZ_command.py && sleep 0' 11389 1726854876.70894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854876.70898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854876.70901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.70910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854876.70912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854876.70915: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854876.70917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.71237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 
3 setting O_NONBLOCK <<< 11389 1726854876.71245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.71285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.89992: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:54:36.865634", "end": "2024-09-20 13:54:36.898812", "delta": "0:00:00.033178", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854876.91797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854876.91801: stderr chunk (state=3): >>><<< 11389 1726854876.91804: stdout chunk (state=3): >>><<< 11389 1726854876.91806: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:54:36.865634", "end": "2024-09-20 13:54:36.898812", "delta": "0:00:00.033178", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854876.92019: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854876.92029: _low_level_execute_command(): starting 11389 1726854876.92032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854876.5579531-12815-123091445502774/ > /dev/null 2>&1 && sleep 0' 11389 1726854876.93294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854876.93337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854876.93594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854876.93598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854876.93600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854876.95631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854876.95635: stdout chunk (state=3): >>><<< 11389 1726854876.95637: stderr chunk (state=3): >>><<< 11389 1726854876.95640: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854876.95642: handler run complete 11389 1726854876.95645: Evaluated conditional (False): False 11389 1726854876.95647: attempt loop complete, returning result 11389 1726854876.95697: _execute() done 11389 1726854876.95700: dumping result to json 11389 1726854876.95705: done dumping result, returning 11389 1726854876.95714: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0affcc66-ac2b-deb8-c119-0000000000c5] 11389 1726854876.95892: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c5 ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n 
echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.033178", "end": "2024-09-20 13:54:36.898812", "rc": 0, "start": "2024-09-20 13:54:36.865634" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11389 1726854876.96037: no more pending results, returning what we have 11389 1726854876.96041: results queue empty 11389 1726854876.96041: checking for any_errors_fatal 11389 1726854876.96052: done checking for any_errors_fatal 11389 1726854876.96053: checking for max_fail_percentage 11389 1726854876.96055: done checking for max_fail_percentage 11389 1726854876.96056: checking to see if all hosts have failed and the running result is not ok 11389 1726854876.96057: done checking to see if all hosts have failed 11389 1726854876.96058: getting the remaining hosts for this loop 11389 1726854876.96059: done getting the remaining hosts for this loop 11389 1726854876.96062: getting the next task for host managed_node3 11389 1726854876.96073: done getting next task for host managed_node3 11389 1726854876.96076: ^ task is: TASK: Stop dnsmasq/radvd services 11389 1726854876.96079: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11389 1726854876.96084: getting variables 11389 1726854876.96086: in VariableManager get_vars() 11389 1726854876.96128: Calling all_inventory to load vars for managed_node3 11389 1726854876.96130: Calling groups_inventory to load vars for managed_node3 11389 1726854876.96133: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854876.96143: Calling all_plugins_play to load vars for managed_node3 11389 1726854876.96146: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854876.96148: Calling groups_plugins_play to load vars for managed_node3 11389 1726854876.97094: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c5 11389 1726854876.97098: WORKER PROCESS EXITING 11389 1726854876.99847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854877.03282: done with get_vars() 11389 1726854877.03353: done getting variables 11389 1726854877.03416: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 13:54:37 -0400 (0:00:00.560) 0:00:29.458 ****** 11389 1726854877.03568: entering _queue_task() for managed_node3/shell 11389 1726854877.04354: worker is 1 (out of 1 available) 11389 1726854877.04368: exiting _queue_task() for managed_node3/shell 11389 
1726854877.04378: done queuing things up, now waiting for results queue to drain 11389 1726854877.04380: waiting for pending results... 11389 1726854877.05308: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 11389 1726854877.05585: in run() - task 0affcc66-ac2b-deb8-c119-0000000000c6 11389 1726854877.05638: variable 'ansible_search_path' from source: unknown 11389 1726854877.05793: variable 'ansible_search_path' from source: unknown 11389 1726854877.05797: calling self._execute() 11389 1726854877.06076: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.06091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.06110: variable 'omit' from source: magic vars 11389 1726854877.07095: variable 'ansible_distribution_major_version' from source: facts 11389 1726854877.07099: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854877.07101: variable 'omit' from source: magic vars 11389 1726854877.07263: variable 'omit' from source: magic vars 11389 1726854877.07320: variable 'omit' from source: magic vars 11389 1726854877.07693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854877.07697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854877.07700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854877.07702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854877.07802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854877.07844: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854877.07938: variable 'ansible_host' from source: host vars for 
'managed_node3' 11389 1726854877.07942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.08104: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854877.08170: Set connection var ansible_timeout to 10 11389 1726854877.08198: Set connection var ansible_connection to ssh 11389 1726854877.08210: Set connection var ansible_shell_type to sh 11389 1726854877.08220: Set connection var ansible_pipelining to False 11389 1726854877.08242: Set connection var ansible_shell_executable to /bin/sh 11389 1726854877.08454: variable 'ansible_shell_executable' from source: unknown 11389 1726854877.08458: variable 'ansible_connection' from source: unknown 11389 1726854877.08461: variable 'ansible_module_compression' from source: unknown 11389 1726854877.08463: variable 'ansible_shell_type' from source: unknown 11389 1726854877.08468: variable 'ansible_shell_executable' from source: unknown 11389 1726854877.08471: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.08473: variable 'ansible_pipelining' from source: unknown 11389 1726854877.08475: variable 'ansible_timeout' from source: unknown 11389 1726854877.08479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.08723: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854877.08742: variable 'omit' from source: magic vars 11389 1726854877.08753: starting attempt loop 11389 1726854877.08810: running the handler 11389 1726854877.08814: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854877.08992: _low_level_execute_command(): starting 11389 1726854877.08997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854877.10346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.10372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.12025: stdout chunk (state=3): >>>/root <<< 11389 1726854877.12293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.12297: stdout chunk (state=3): >>><<< 11389 1726854877.12300: stderr chunk (state=3): >>><<< 11389 1726854877.12304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.12307: _low_level_execute_command(): starting 11389 1726854877.12309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392 `" && echo ansible-tmp-1726854877.1224449-12831-31020026546392="` echo /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392 `" ) && sleep 0' 11389 1726854877.13280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854877.13284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.13296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.13318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.13356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854877.13373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.13443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.15492: stdout chunk (state=3): >>>ansible-tmp-1726854877.1224449-12831-31020026546392=/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392 <<< 11389 1726854877.15505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.15508: stdout chunk (state=3): >>><<< 11389 1726854877.15510: stderr chunk (state=3): >>><<< 11389 1726854877.15513: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854877.1224449-12831-31020026546392=/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.15546: variable 'ansible_module_compression' from source: unknown 11389 1726854877.15600: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854877.15635: variable 'ansible_facts' from source: unknown 11389 1726854877.15907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py 11389 1726854877.16485: Sending initial data 11389 1726854877.16491: Sent initial data (155 bytes) 11389 1726854877.17412: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.17416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854877.17437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.17443: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration <<< 11389 1726854877.17473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.17478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.17657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.17660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854877.17679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.17683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.17803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.19368: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854877.19395: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 11389 1726854877.19582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmpzru3b8wq /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py <<< 11389 1726854877.19586: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py" <<< 11389 1726854877.19646: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmpzru3b8wq" to remote "/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py" <<< 11389 1726854877.20932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.20959: stderr chunk (state=3): >>><<< 11389 1726854877.20963: stdout chunk (state=3): >>><<< 11389 1726854877.20995: done transferring module to remote 11389 1726854877.21003: _low_level_execute_command(): starting 11389 1726854877.21009: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/ /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py && sleep 0' 11389 1726854877.22230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.22234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854877.22507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.22700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.24522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.24564: stderr chunk (state=3): >>><<< 11389 1726854877.24568: stdout chunk (state=3): >>><<< 11389 1726854877.24595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.24599: _low_level_execute_command(): starting 11389 1726854877.24603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/AnsiballZ_command.py && sleep 0' 11389 1726854877.25718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.25723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854877.25970: stderr chunk (state=3): >>>debug2: match not found <<< 11389 1726854877.25974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.25977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854877.25979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.26093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854877.26202: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.44127: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:54:37.411731", "end": "2024-09-20 13:54:37.438136", "delta": "0:00:00.026405", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854877.45540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854877.45543: stdout chunk (state=3): >>><<< 11389 1726854877.45545: stderr chunk (state=3): >>><<< 11389 1726854877.45562: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:54:37.411731", "end": "2024-09-20 13:54:37.438136", "delta": "0:00:00.026405", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854877.46021: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854877.46025: _low_level_execute_command(): starting 11389 1726854877.46028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854877.1224449-12831-31020026546392/ > /dev/null 2>&1 && sleep 0' 11389 1726854877.47161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854877.47393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.47397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.47399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 
1726854877.47402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.47593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.47597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.47659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.49473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.49560: stderr chunk (state=3): >>><<< 11389 1726854877.49563: stdout chunk (state=3): >>><<< 11389 1726854877.49648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.49654: handler run complete 11389 1726854877.49682: Evaluated conditional (False): False 11389 1726854877.49744: attempt loop complete, returning result 11389 1726854877.49748: _execute() done 11389 1726854877.49750: dumping result to json 11389 1726854877.49758: done dumping result, returning 11389 1726854877.49767: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0affcc66-ac2b-deb8-c119-0000000000c6] 11389 1726854877.49775: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c6 11389 1726854877.49885: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c6 ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026405", "end": "2024-09-20 13:54:37.438136", "rc": 0, "start": "2024-09-20 13:54:37.411731" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 
'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11389 1726854877.49959: no more pending results, returning what we have 11389 1726854877.49963: results queue empty 11389 1726854877.49964: checking for any_errors_fatal 11389 1726854877.49975: done checking for any_errors_fatal 11389 1726854877.49976: checking for max_fail_percentage 11389 1726854877.49979: done checking for max_fail_percentage 11389 1726854877.49980: checking to see if all hosts have failed and the running result is not ok 11389 1726854877.49981: done checking to see if all hosts have failed 11389 1726854877.49981: getting the remaining hosts for this loop 11389 1726854877.49983: done getting the remaining hosts for this loop 11389 1726854877.49995: getting the next task for host managed_node3 11389 1726854877.50006: done getting next task for host managed_node3 11389 1726854877.50008: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11389 1726854877.50012: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854877.50016: getting variables 11389 1726854877.50018: in VariableManager get_vars() 11389 1726854877.50061: Calling all_inventory to load vars for managed_node3 11389 1726854877.50064: Calling groups_inventory to load vars for managed_node3 11389 1726854877.50069: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854877.50076: WORKER PROCESS EXITING 11389 1726854877.50331: Calling all_plugins_play to load vars for managed_node3 11389 1726854877.50335: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854877.50344: Calling groups_plugins_play to load vars for managed_node3 11389 1726854877.54025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854877.58019: done with get_vars() 11389 1726854877.58048: done getting variables 11389 1726854877.58216: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 13:54:37 -0400 (0:00:00.546) 0:00:30.005 ****** 11389 1726854877.58248: entering _queue_task() for managed_node3/command 11389 1726854877.59227: worker is 1 (out of 1 available) 11389 1726854877.59238: exiting _queue_task() for managed_node3/command 11389 1726854877.59250: done queuing things up, now waiting for results queue to drain 11389 1726854877.59251: waiting for pending results... 
11389 1726854877.59491: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 11389 1726854877.59940: in run() - task 0affcc66-ac2b-deb8-c119-0000000000c7 11389 1726854877.59944: variable 'ansible_search_path' from source: unknown 11389 1726854877.59947: calling self._execute() 11389 1726854877.60110: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.60121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.60137: variable 'omit' from source: magic vars 11389 1726854877.60951: variable 'ansible_distribution_major_version' from source: facts 11389 1726854877.61076: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854877.61257: variable 'network_provider' from source: set_fact 11389 1726854877.61270: Evaluated conditional (network_provider == "initscripts"): False 11389 1726854877.61277: when evaluation is False, skipping this task 11389 1726854877.61282: _execute() done 11389 1726854877.61403: dumping result to json 11389 1726854877.61407: done dumping result, returning 11389 1726854877.61410: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [0affcc66-ac2b-deb8-c119-0000000000c7] 11389 1726854877.61412: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c7 11389 1726854877.61650: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c7 11389 1726854877.61654: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11389 1726854877.61707: no more pending results, returning what we have 11389 1726854877.61710: results queue empty 11389 1726854877.61711: checking for any_errors_fatal 11389 1726854877.61722: done checking for any_errors_fatal 11389 1726854877.61722: checking for max_fail_percentage 11389 1726854877.61725: done 
checking for max_fail_percentage 11389 1726854877.61726: checking to see if all hosts have failed and the running result is not ok 11389 1726854877.61727: done checking to see if all hosts have failed 11389 1726854877.61728: getting the remaining hosts for this loop 11389 1726854877.61730: done getting the remaining hosts for this loop 11389 1726854877.61733: getting the next task for host managed_node3 11389 1726854877.61742: done getting next task for host managed_node3 11389 1726854877.61745: ^ task is: TASK: Verify network state restored to default 11389 1726854877.61748: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854877.61753: getting variables 11389 1726854877.61754: in VariableManager get_vars() 11389 1726854877.61801: Calling all_inventory to load vars for managed_node3 11389 1726854877.61803: Calling groups_inventory to load vars for managed_node3 11389 1726854877.61806: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854877.61818: Calling all_plugins_play to load vars for managed_node3 11389 1726854877.61821: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854877.61823: Calling groups_plugins_play to load vars for managed_node3 11389 1726854877.65242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854877.67470: done with get_vars() 11389 1726854877.67501: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 13:54:37 -0400 (0:00:00.093) 0:00:30.099 ****** 11389 1726854877.67604: entering _queue_task() for managed_node3/include_tasks 11389 1726854877.67967: worker is 1 (out of 1 available) 11389 1726854877.67980: exiting _queue_task() for managed_node3/include_tasks 11389 1726854877.67995: done queuing things up, now waiting for results queue to drain 11389 1726854877.67996: waiting for pending results... 
11389 1726854877.68645: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 11389 1726854877.68651: in run() - task 0affcc66-ac2b-deb8-c119-0000000000c8 11389 1726854877.68662: variable 'ansible_search_path' from source: unknown 11389 1726854877.68665: calling self._execute() 11389 1726854877.68754: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.68758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.68772: variable 'omit' from source: magic vars 11389 1726854877.69704: variable 'ansible_distribution_major_version' from source: facts 11389 1726854877.69709: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854877.69712: _execute() done 11389 1726854877.69724: dumping result to json 11389 1726854877.69727: done dumping result, returning 11389 1726854877.69730: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0affcc66-ac2b-deb8-c119-0000000000c8] 11389 1726854877.69732: sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c8 11389 1726854877.70308: no more pending results, returning what we have 11389 1726854877.70314: in VariableManager get_vars() 11389 1726854877.70369: Calling all_inventory to load vars for managed_node3 11389 1726854877.70372: Calling groups_inventory to load vars for managed_node3 11389 1726854877.70375: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854877.70398: Calling all_plugins_play to load vars for managed_node3 11389 1726854877.70402: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854877.70407: Calling groups_plugins_play to load vars for managed_node3 11389 1726854877.71613: done sending task result for task 0affcc66-ac2b-deb8-c119-0000000000c8 11389 1726854877.71617: WORKER PROCESS EXITING 11389 1726854877.74578: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854877.77343: done with get_vars() 11389 1726854877.77370: variable 'ansible_search_path' from source: unknown 11389 1726854877.77754: we have included files to process 11389 1726854877.77756: generating all_blocks data 11389 1726854877.77758: done generating all_blocks data 11389 1726854877.77765: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11389 1726854877.77766: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11389 1726854877.77769: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11389 1726854877.78315: done processing included file 11389 1726854877.78317: iterating over new_blocks loaded from include file 11389 1726854877.78319: in VariableManager get_vars() 11389 1726854877.78339: done with get_vars() 11389 1726854877.78341: filtering new block on tags 11389 1726854877.78380: done filtering new block on tags 11389 1726854877.78383: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 11389 1726854877.78392: extending task lists for all hosts with included blocks 11389 1726854877.79810: done extending task lists 11389 1726854877.79812: done processing included files 11389 1726854877.79813: results queue empty 11389 1726854877.79814: checking for any_errors_fatal 11389 1726854877.79821: done checking for any_errors_fatal 11389 1726854877.79822: checking for max_fail_percentage 11389 1726854877.79824: done checking for max_fail_percentage 11389 1726854877.79825: checking to see if all hosts have failed and the running 
result is not ok 11389 1726854877.79826: done checking to see if all hosts have failed 11389 1726854877.79826: getting the remaining hosts for this loop 11389 1726854877.79827: done getting the remaining hosts for this loop 11389 1726854877.79830: getting the next task for host managed_node3 11389 1726854877.79834: done getting next task for host managed_node3 11389 1726854877.79837: ^ task is: TASK: Check routes and DNS 11389 1726854877.79839: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11389 1726854877.79842: getting variables 11389 1726854877.79843: in VariableManager get_vars() 11389 1726854877.79862: Calling all_inventory to load vars for managed_node3 11389 1726854877.79864: Calling groups_inventory to load vars for managed_node3 11389 1726854877.79866: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854877.79872: Calling all_plugins_play to load vars for managed_node3 11389 1726854877.79874: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854877.79877: Calling groups_plugins_play to load vars for managed_node3 11389 1726854877.81159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854877.82705: done with get_vars() 11389 1726854877.82741: done getting variables 11389 1726854877.82824: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:54:37 -0400 (0:00:00.152) 0:00:30.251 ****** 11389 1726854877.82860: entering _queue_task() for managed_node3/shell 11389 1726854877.83313: worker is 1 (out of 1 available) 11389 1726854877.83327: exiting _queue_task() for managed_node3/shell 11389 1726854877.83339: done queuing things up, now waiting for results queue to drain 11389 1726854877.83341: waiting for pending results... 
11389 1726854877.83657: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 11389 1726854877.83777: in run() - task 0affcc66-ac2b-deb8-c119-00000000056d 11389 1726854877.83797: variable 'ansible_search_path' from source: unknown 11389 1726854877.83800: variable 'ansible_search_path' from source: unknown 11389 1726854877.83846: calling self._execute() 11389 1726854877.83973: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.83977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.83980: variable 'omit' from source: magic vars 11389 1726854877.84391: variable 'ansible_distribution_major_version' from source: facts 11389 1726854877.84395: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854877.84402: variable 'omit' from source: magic vars 11389 1726854877.84523: variable 'omit' from source: magic vars 11389 1726854877.84527: variable 'omit' from source: magic vars 11389 1726854877.84529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854877.84591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854877.84627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854877.84656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854877.84659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854877.84735: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854877.84739: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.84749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.84883: 
Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854877.84889: Set connection var ansible_timeout to 10 11389 1726854877.84892: Set connection var ansible_connection to ssh 11389 1726854877.84894: Set connection var ansible_shell_type to sh 11389 1726854877.84896: Set connection var ansible_pipelining to False 11389 1726854877.84955: Set connection var ansible_shell_executable to /bin/sh 11389 1726854877.84959: variable 'ansible_shell_executable' from source: unknown 11389 1726854877.84961: variable 'ansible_connection' from source: unknown 11389 1726854877.84965: variable 'ansible_module_compression' from source: unknown 11389 1726854877.84969: variable 'ansible_shell_type' from source: unknown 11389 1726854877.84972: variable 'ansible_shell_executable' from source: unknown 11389 1726854877.84974: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854877.84976: variable 'ansible_pipelining' from source: unknown 11389 1726854877.84978: variable 'ansible_timeout' from source: unknown 11389 1726854877.84980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854877.85457: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854877.85461: variable 'omit' from source: magic vars 11389 1726854877.85463: starting attempt loop 11389 1726854877.85465: running the handler 11389 1726854877.85470: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854877.85473: 
_low_level_execute_command(): starting 11389 1726854877.85475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854877.86139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854877.86146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.86149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.86151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.86215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.86393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854877.86397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.86399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.86565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.88199: stdout chunk (state=3): >>>/root <<< 11389 1726854877.88498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.88502: stdout chunk (state=3): >>><<< 11389 1726854877.88504: stderr 
chunk (state=3): >>><<< 11389 1726854877.88508: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.88511: _low_level_execute_command(): starting 11389 1726854877.88514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821 `" && echo ansible-tmp-1726854877.883846-12875-71283764493821="` echo /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821 `" ) && sleep 0' 11389 1726854877.89048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854877.89061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.89103: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854877.89109: stderr chunk (state=3): >>>debug2: match found <<< 11389 1726854877.89121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.89199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.89222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.89311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.91214: stdout chunk (state=3): >>>ansible-tmp-1726854877.883846-12875-71283764493821=/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821 <<< 11389 1726854877.91411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.91434: stdout chunk (state=3): >>><<< 11389 1726854877.91457: stderr chunk (state=3): >>><<< 11389 1726854877.91642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854877.883846-12875-71283764493821=/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.91651: variable 'ansible_module_compression' from source: unknown 11389 1726854877.91655: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854877.91714: variable 'ansible_facts' from source: unknown 11389 1726854877.91850: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py 11389 1726854877.92286: Sending initial data 11389 1726854877.92298: Sent initial data (154 bytes) 11389 1726854877.93110: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.93179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.93226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854877.93247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.93290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.93351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.94930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854877.95004: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11389 1726854877.95064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5ijy7z7h /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py <<< 11389 1726854877.95070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py" <<< 11389 1726854877.95195: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp5ijy7z7h" to remote "/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py" <<< 11389 1726854877.96258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.96379: stdout chunk (state=3): >>><<< 11389 1726854877.96382: stderr chunk (state=3): >>><<< 11389 1726854877.96384: done transferring module to remote 11389 1726854877.96386: _low_level_execute_command(): starting 11389 1726854877.96486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/ /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py && sleep 0' 11389 1726854877.97370: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854877.97404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854877.97443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854877.97549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854877.97663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854877.97714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854877.98003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854877.99817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854877.99821: stdout chunk (state=3): >>><<< 11389 1726854877.99823: stderr chunk (state=3): >>><<< 11389 1726854877.99847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854877.99859: _low_level_execute_command(): starting 11389 1726854877.99875: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/AnsiballZ_command.py && sleep 0' 11389 1726854878.00526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854878.00553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.00572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.00599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854878.00618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 <<< 11389 1726854878.00660: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found <<< 11389 1726854878.00735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.00762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.00863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.16939: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2974sec preferred_lft 2974sec\n inet6 fe80::1088:11ff:feda:7fa3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.244 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:54:38.159706", "end": "2024-09-20 13:54:38.168230", "delta": "0:00:00.008524", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n 
ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854878.18492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. <<< 11389 1726854878.18496: stderr chunk (state=3): >>><<< 11389 1726854878.18506: stdout chunk (state=3): >>><<< 11389 1726854878.18527: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2974sec preferred_lft 2974sec\n inet6 fe80::1088:11ff:feda:7fa3/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.244 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:54:38.159706", "end": "2024-09-20 13:54:38.168230", "delta": "0:00:00.008524", "msg": "", 
"invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
11389 1726854878.18567: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854878.18573: _low_level_execute_command(): starting 11389 1726854878.18579: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854877.883846-12875-71283764493821/ > /dev/null 2>&1 && sleep 0' 11389 1726854878.19035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.19038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.19041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address <<< 11389 1726854878.19044: stderr chunk 
(state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.19091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.19094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.19161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.21018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.21022: stdout chunk (state=3): >>><<< 11389 1726854878.21025: stderr chunk (state=3): >>><<< 11389 1726854878.21027: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854878.21029: handler run complete 11389 1726854878.21145: Evaluated conditional (False): False 11389 1726854878.21148: attempt loop complete, returning result 11389 1726854878.21150: _execute() done 11389 1726854878.21152: dumping result to json 11389 1726854878.21154: done dumping result, returning 11389 1726854878.21157: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0affcc66-ac2b-deb8-c119-00000000056d] 11389 1726854878.21159: sending task result for task 0affcc66-ac2b-deb8-c119-00000000056d 11389 1726854878.21233: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000056d 11389 1726854878.21236: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008524", "end": "2024-09-20 13:54:38.168230", "rc": 0, "start": "2024-09-20 13:54:38.159706" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:88:11:da:7f:a3 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.244/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2974sec preferred_lft 2974sec inet6 fe80::1088:11ff:feda:7fa3/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto 
dhcp src 10.31.9.244 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.244 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11389 1726854878.21375: no more pending results, returning what we have 11389 1726854878.21379: results queue empty 11389 1726854878.21380: checking for any_errors_fatal 11389 1726854878.21382: done checking for any_errors_fatal 11389 1726854878.21383: checking for max_fail_percentage 11389 1726854878.21385: done checking for max_fail_percentage 11389 1726854878.21386: checking to see if all hosts have failed and the running result is not ok 11389 1726854878.21494: done checking to see if all hosts have failed 11389 1726854878.21496: getting the remaining hosts for this loop 11389 1726854878.21499: done getting the remaining hosts for this loop 11389 1726854878.21503: getting the next task for host managed_node3 11389 1726854878.21509: done getting next task for host managed_node3 11389 1726854878.21512: ^ task is: TASK: Verify DNS and network connectivity 11389 1726854878.21515: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11389 1726854878.21523: getting variables 11389 1726854878.21525: in VariableManager get_vars() 11389 1726854878.21569: Calling all_inventory to load vars for managed_node3 11389 1726854878.21572: Calling groups_inventory to load vars for managed_node3 11389 1726854878.21574: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854878.21586: Calling all_plugins_play to load vars for managed_node3 11389 1726854878.21694: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854878.21705: Calling groups_plugins_play to load vars for managed_node3 11389 1726854878.22580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854878.23454: done with get_vars() 11389 1726854878.23472: done getting variables 11389 1726854878.23516: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:54:38 -0400 (0:00:00.406) 0:00:30.658 ****** 11389 1726854878.23540: entering _queue_task() for managed_node3/shell 11389 1726854878.23790: worker is 1 (out of 1 available) 11389 1726854878.23805: exiting _queue_task() for managed_node3/shell 11389 1726854878.23816: done queuing things up, now waiting for results queue to drain 11389 1726854878.23818: waiting for pending results... 
11389 1726854878.24019: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 11389 1726854878.24137: in run() - task 0affcc66-ac2b-deb8-c119-00000000056e 11389 1726854878.24144: variable 'ansible_search_path' from source: unknown 11389 1726854878.24148: variable 'ansible_search_path' from source: unknown 11389 1726854878.24183: calling self._execute() 11389 1726854878.24260: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854878.24271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854878.24275: variable 'omit' from source: magic vars 11389 1726854878.24596: variable 'ansible_distribution_major_version' from source: facts 11389 1726854878.24607: Evaluated conditional (ansible_distribution_major_version != '6'): True 11389 1726854878.24772: variable 'ansible_facts' from source: unknown 11389 1726854878.25450: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11389 1726854878.25458: variable 'omit' from source: magic vars 11389 1726854878.25518: variable 'omit' from source: magic vars 11389 1726854878.25552: variable 'omit' from source: magic vars 11389 1726854878.25592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11389 1726854878.25631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11389 1726854878.25656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11389 1726854878.25674: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854878.25680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11389 1726854878.25715: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11389 1726854878.25719: variable 
'ansible_host' from source: host vars for 'managed_node3' 11389 1726854878.25722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854878.25821: Set connection var ansible_module_compression to ZIP_DEFLATED 11389 1726854878.25828: Set connection var ansible_timeout to 10 11389 1726854878.25831: Set connection var ansible_connection to ssh 11389 1726854878.25835: Set connection var ansible_shell_type to sh 11389 1726854878.25848: Set connection var ansible_pipelining to False 11389 1726854878.25870: Set connection var ansible_shell_executable to /bin/sh 11389 1726854878.25894: variable 'ansible_shell_executable' from source: unknown 11389 1726854878.25897: variable 'ansible_connection' from source: unknown 11389 1726854878.25900: variable 'ansible_module_compression' from source: unknown 11389 1726854878.25902: variable 'ansible_shell_type' from source: unknown 11389 1726854878.25905: variable 'ansible_shell_executable' from source: unknown 11389 1726854878.25907: variable 'ansible_host' from source: host vars for 'managed_node3' 11389 1726854878.25910: variable 'ansible_pipelining' from source: unknown 11389 1726854878.25912: variable 'ansible_timeout' from source: unknown 11389 1726854878.25917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11389 1726854878.26019: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854878.26028: variable 'omit' from source: magic vars 11389 1726854878.26032: starting attempt loop 11389 1726854878.26035: running the handler 11389 1726854878.26044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 11389 1726854878.26059: _low_level_execute_command(): starting 11389 1726854878.26066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11389 1726854878.26550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.26589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.26594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.26596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.26642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854878.26649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.26651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.26714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.28357: stdout chunk (state=3): >>>/root <<< 11389 1726854878.28462: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.28497: stderr chunk (state=3): >>><<< 11389 1726854878.28500: stdout chunk (state=3): >>><<< 11389 1726854878.28523: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854878.28613: _low_level_execute_command(): starting 11389 1726854878.28617: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282 `" && echo ansible-tmp-1726854878.2852962-12897-50867071999282="` echo /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282 `" ) && sleep 0' 11389 1726854878.29159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854878.29176: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.29194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.29250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.29314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854878.29336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.29434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.31345: stdout chunk (state=3): >>>ansible-tmp-1726854878.2852962-12897-50867071999282=/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282 <<< 11389 1726854878.31508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.31512: stdout chunk (state=3): >>><<< 11389 1726854878.31514: stderr chunk (state=3): >>><<< 11389 1726854878.31533: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726854878.2852962-12897-50867071999282=/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854878.31571: variable 'ansible_module_compression' from source: unknown 11389 1726854878.31644: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11389p20__4u0/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11389 1726854878.31868: variable 'ansible_facts' from source: unknown 11389 1726854878.31872: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py 11389 1726854878.32018: Sending initial data 11389 1726854878.32022: Sent initial data (155 bytes) 11389 1726854878.32565: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854878.32578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.32593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11389 1726854878.32612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854878.32702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.32713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854878.32725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.32744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.32832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.34416: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11389 1726854878.34453: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11389 1726854878.34502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11389 1726854878.34593: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11389p20__4u0/tmp7ceh8l9f /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py <<< 11389 1726854878.34596: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py" <<< 11389 1726854878.34655: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11389p20__4u0/tmp7ceh8l9f" to remote "/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py" <<< 11389 1726854878.35570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.35573: stdout chunk (state=3): >>><<< 11389 1726854878.35576: stderr chunk (state=3): >>><<< 11389 1726854878.35590: done transferring module to remote 11389 1726854878.35607: _low_level_execute_command(): starting 11389 1726854878.35617: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/ /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py && sleep 0' 11389 1726854878.36300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.36344: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.36441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.36472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.36576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.38441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.38444: stdout chunk (state=3): >>><<< 11389 1726854878.38447: stderr chunk (state=3): >>><<< 11389 1726854878.38502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854878.38506: _low_level_execute_command(): starting 11389 1726854878.38509: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/AnsiballZ_command.py && sleep 0' 11389 1726854878.39208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11389 1726854878.39225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11389 1726854878.39258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.39276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854878.39304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.39370: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.39419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854878.39439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.39472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.39610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.82613: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 
--:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1499 0 --:--:-- --:--:-- --:--:-- 1495\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 5809 0 --:--:-- --:--:-- --:--:-- 5820", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:54:38.546363", "end": "2024-09-20 13:54:38.822643", "delta": "0:00:00.276280", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11389 1726854878.84711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 
<<< 11389 1726854878.84715: stdout chunk (state=3): >>><<< 11389 1726854878.84718: stderr chunk (state=3): >>><<< 11389 1726854878.84721: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1499 0 --:--:-- --:--:-- --:--:-- 1495\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 5809 0 --:--:-- --:--:-- --:--:-- 5820", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:54:38.546363", "end": "2024-09-20 13:54:38.822643", "delta": "0:00:00.276280", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.244 closed. 11389 1726854878.84730: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11389 1726854878.84732: _low_level_execute_command(): starting 11389 1726854878.84735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726854878.2852962-12897-50867071999282/ > /dev/null 2>&1 && sleep 0' 11389 1726854878.86202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11389 1726854878.86425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11389 1726854878.86439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11389 1726854878.86451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11389 1726854878.86511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' <<< 11389 1726854878.86524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11389 1726854878.86609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11389 1726854878.86685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11389 1726854878.88543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11389 1726854878.88693: stderr chunk (state=3): >>><<< 11389 1726854878.88706: stdout chunk (state=3): >>><<< 11389 1726854878.88740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.244 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.244 originally 10.31.9.244 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/db1ec2560f' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11389 1726854878.88745: handler run complete 11389 1726854878.88831: Evaluated conditional (False): False 11389 1726854878.88844: attempt loop complete, returning result 11389 1726854878.88884: _execute() done 11389 1726854878.88905: dumping result to json 11389 1726854878.89085: done dumping result, returning 11389 1726854878.89095: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0affcc66-ac2b-deb8-c119-00000000056e] 11389 1726854878.89097: sending task result for task 0affcc66-ac2b-deb8-c119-00000000056e ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.276280", "end": "2024-09-20 13:54:38.822643", "rc": 0, "start": "2024-09-20 13:54:38.546363" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1499 0 --:--:-- --:--:-- --:--:-- 1495 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 5809 0 --:--:-- --:--:-- --:--:-- 5820 11389 1726854878.89540: no more pending results, returning what we have 11389 1726854878.89795: 
results queue empty 11389 1726854878.89797: checking for any_errors_fatal 11389 1726854878.89807: done checking for any_errors_fatal 11389 1726854878.89808: checking for max_fail_percentage 11389 1726854878.89810: done checking for max_fail_percentage 11389 1726854878.89811: checking to see if all hosts have failed and the running result is not ok 11389 1726854878.89812: done checking to see if all hosts have failed 11389 1726854878.89812: getting the remaining hosts for this loop 11389 1726854878.89814: done getting the remaining hosts for this loop 11389 1726854878.89817: getting the next task for host managed_node3 11389 1726854878.89832: done getting next task for host managed_node3 11389 1726854878.89835: ^ task is: TASK: meta (flush_handlers) 11389 1726854878.89837: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11389 1726854878.89841: getting variables 11389 1726854878.89843: in VariableManager get_vars() 11389 1726854878.90097: Calling all_inventory to load vars for managed_node3 11389 1726854878.90100: Calling groups_inventory to load vars for managed_node3 11389 1726854878.90104: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854878.90115: Calling all_plugins_play to load vars for managed_node3 11389 1726854878.90118: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854878.90121: Calling groups_plugins_play to load vars for managed_node3 11389 1726854878.90770: done sending task result for task 0affcc66-ac2b-deb8-c119-00000000056e 11389 1726854878.90774: WORKER PROCESS EXITING 11389 1726854878.92285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854878.95905: done with get_vars() 11389 1726854878.95938: done getting variables 11389 1726854878.96230: in VariableManager get_vars() 11389 1726854878.96247: Calling all_inventory to load vars for managed_node3 11389 1726854878.96249: Calling groups_inventory to load vars for managed_node3 11389 1726854878.96251: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854878.96257: Calling all_plugins_play to load vars for managed_node3 11389 1726854878.96259: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854878.96262: Calling groups_plugins_play to load vars for managed_node3 11389 1726854878.98032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854879.00609: done with get_vars() 11389 1726854879.00641: done queuing things up, now waiting for results queue to drain 11389 1726854879.00643: results queue empty 11389 1726854879.00644: checking for any_errors_fatal 11389 1726854879.00721: done checking for any_errors_fatal 11389 1726854879.00722: checking for max_fail_percentage 11389 
1726854879.00724: done checking for max_fail_percentage 11389 1726854879.00724: checking to see if all hosts have failed and the running result is not ok 11389 1726854879.00725: done checking to see if all hosts have failed 11389 1726854879.00726: getting the remaining hosts for this loop 11389 1726854879.00727: done getting the remaining hosts for this loop 11389 1726854879.00730: getting the next task for host managed_node3 11389 1726854879.00734: done getting next task for host managed_node3 11389 1726854879.00736: ^ task is: TASK: meta (flush_handlers) 11389 1726854879.00737: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11389 1726854879.00740: getting variables 11389 1726854879.00741: in VariableManager get_vars() 11389 1726854879.00809: Calling all_inventory to load vars for managed_node3 11389 1726854879.00812: Calling groups_inventory to load vars for managed_node3 11389 1726854879.00814: Calling all_plugins_inventory to load vars for managed_node3 11389 1726854879.00819: Calling all_plugins_play to load vars for managed_node3 11389 1726854879.00822: Calling groups_plugins_inventory to load vars for managed_node3 11389 1726854879.00824: Calling groups_plugins_play to load vars for managed_node3 11389 1726854879.02132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11389 1726854879.03016: done with get_vars() 11389 1726854879.03033: done getting variables 11389 1726854879.03077: in VariableManager get_vars() 11389 1726854879.03089: Calling all_inventory to load vars for managed_node3 11389 1726854879.03091: Calling groups_inventory to load vars for managed_node3 11389 1726854879.03093: Calling all_plugins_inventory to load vars for managed_node3 11389 
1726854879.03096: Calling all_plugins_play to load vars for managed_node3
11389 1726854879.03098: Calling groups_plugins_inventory to load vars for managed_node3
11389 1726854879.03099: Calling groups_plugins_play to load vars for managed_node3
11389 1726854879.03872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11389 1726854879.05127: done with get_vars()
11389 1726854879.05164: done queuing things up, now waiting for results queue to drain
11389 1726854879.05168: results queue empty
11389 1726854879.05169: checking for any_errors_fatal
11389 1726854879.05170: done checking for any_errors_fatal
11389 1726854879.05171: checking for max_fail_percentage
11389 1726854879.05172: done checking for max_fail_percentage
11389 1726854879.05174: checking to see if all hosts have failed and the running result is not ok
11389 1726854879.05175: done checking to see if all hosts have failed
11389 1726854879.05176: getting the remaining hosts for this loop
11389 1726854879.05177: done getting the remaining hosts for this loop
11389 1726854879.05180: getting the next task for host managed_node3
11389 1726854879.05183: done getting next task for host managed_node3
11389 1726854879.05184: ^ task is: None
11389 1726854879.05185: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11389 1726854879.05186: done queuing things up, now waiting for results queue to drain
11389 1726854879.05188: results queue empty
11389 1726854879.05189: checking for any_errors_fatal
11389 1726854879.05190: done checking for any_errors_fatal
11389 1726854879.05191: checking for max_fail_percentage
11389 1726854879.05192: done checking for max_fail_percentage
11389 1726854879.05192: checking to see if all hosts have failed and the running result is not ok
11389 1726854879.05193: done checking to see if all hosts have failed
11389 1726854879.05195: getting the next task for host managed_node3
11389 1726854879.05197: done getting next task for host managed_node3
11389 1726854879.05198: ^ task is: None
11389 1726854879.05199: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=76   changed=2    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  13:54:39 -0400 (0:00:00.817)       0:00:31.476 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.00s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.83s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.32s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.23s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.99s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install dnsmasq --------------------------------------------------------- 0.97s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.96s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.93s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.90s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
Verify DNS and network connectivity ------------------------------------- 0.82s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install pgrep, sysctl --------------------------------------------------- 0.70s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.65s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.64s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.59s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Remove test interfaces -------------------------------------------------- 0.56s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Stop dnsmasq/radvd services --------------------------------------------- 0.55s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
Delete the device 'nm-bond' --------------------------------------------- 0.53s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.52s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
11389 1726854879.05357: RUNNING CLEANUP