[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
34139 1726867640.78161: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
34139 1726867640.78580: Added group all to inventory
34139 1726867640.78582: Added group ungrouped to inventory
34139 1726867640.78586: Group all now contains ungrouped
34139 1726867640.78589: Examining possible inventory source: /tmp/network-5rw/inventory.yml
34139 1726867640.94576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
34139 1726867640.94940: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
34139 1726867640.94962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
34139 1726867640.95015: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
34139 1726867640.95288: Loaded config def from plugin (inventory/script)
34139 1726867640.95290: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
34139 1726867640.95330: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
34139 1726867640.95416: Loaded config def from plugin (inventory/yaml)
34139 1726867640.95418: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
34139 1726867640.95704: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
34139 1726867640.96525: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
34139 1726867640.96529: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
34139 1726867640.96532: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
34139 1726867640.96538: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
34139 1726867640.96542: Loading data from /tmp/network-5rw/inventory.yml
34139 1726867640.96609: /tmp/network-5rw/inventory.yml was not parsable by auto
34139 1726867640.96671: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
34139 1726867640.96917: Loading data from /tmp/network-5rw/inventory.yml
34139 1726867640.96999: group all already in inventory
34139 1726867640.97006: set inventory_file for managed_node1
34139 1726867640.97010: set inventory_dir for managed_node1
34139 1726867640.97011: Added host managed_node1 to inventory
34139 1726867640.97013: Added host managed_node1 to group all
34139 1726867640.97014: set ansible_host for managed_node1
34139 1726867640.97015: set ansible_ssh_extra_args for managed_node1
34139 1726867640.97018: set inventory_file for managed_node2
34139 1726867640.97020: set inventory_dir for managed_node2
34139 1726867640.97021: Added host managed_node2 to inventory
34139 1726867640.97022: Added host managed_node2 to group all
34139 1726867640.97023: set ansible_host for managed_node2
34139 1726867640.97024: set ansible_ssh_extra_args for managed_node2
34139 1726867640.97026: set inventory_file for managed_node3
34139 1726867640.97028: set inventory_dir for managed_node3
34139 1726867640.97029: Added host managed_node3 to inventory
34139 1726867640.97030: Added host managed_node3 to group all
34139 1726867640.97031: set ansible_host for managed_node3
34139 1726867640.97031: set ansible_ssh_extra_args for managed_node3
34139 1726867640.97034: Reconcile groups and hosts in inventory.
34139 1726867640.97037: Group ungrouped now contains managed_node1
34139 1726867640.97039: Group ungrouped now contains managed_node2
34139 1726867640.97040: Group ungrouped now contains managed_node3
34139 1726867640.97318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
34139 1726867640.97624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
34139 1726867640.97698: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
34139 1726867640.97728: Loaded config def from plugin (vars/host_group_vars)
34139 1726867640.97730: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
34139 1726867640.97737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
34139 1726867640.97744: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
34139 1726867640.97795: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
34139 1726867640.98130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867640.98229: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
34139 1726867640.98268: Loaded config def from plugin (connection/local)
34139 1726867640.98270: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
34139 1726867640.98947: Loaded config def from plugin (connection/paramiko_ssh)
34139 1726867640.98950: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
34139 1726867641.00040: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34139 1726867641.00086: Loaded config def from plugin (connection/psrp)
34139 1726867641.00089: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
34139 1726867641.00853: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34139 1726867641.00898: Loaded config def from plugin (connection/ssh)
34139 1726867641.00901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
34139 1726867641.03244: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34139 1726867641.03358: Loaded config def from plugin (connection/winrm)
34139 1726867641.03361: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
34139 1726867641.03419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
34139 1726867641.03508: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
34139 1726867641.03599: Loaded config def from plugin (shell/cmd)
34139 1726867641.03602: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
34139 1726867641.03637: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
34139 1726867641.03706: Loaded config def from plugin (shell/powershell)
34139 1726867641.03711: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
34139 1726867641.03767: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
34139 1726867641.03958: Loaded config def from plugin (shell/sh)
34139 1726867641.03960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
34139 1726867641.03995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
34139 1726867641.04125: Loaded config def from plugin (become/runas)
34139 1726867641.04127: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
34139 1726867641.04317: Loaded config def from plugin (become/su)
34139 1726867641.04319: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
34139 1726867641.04476: Loaded config def from plugin (become/sudo)
34139 1726867641.04482: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
34139 1726867641.04515: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34139 1726867641.04825: in VariableManager get_vars()
34139 1726867641.04843: done with get_vars()
34139 1726867641.04972: trying /usr/local/lib/python3.12/site-packages/ansible/modules
34139 1726867641.08262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
34139 1726867641.08383: in VariableManager get_vars()
34139 1726867641.08387: done with get_vars()
34139 1726867641.08393: variable 'playbook_dir' from source: magic vars
34139 1726867641.08394: variable 'ansible_playbook_python' from source: magic vars
34139 1726867641.08395: variable 'ansible_config_file' from source: magic vars
34139 1726867641.08396: variable 'groups' from source: magic vars
34139 1726867641.08396: variable 'omit' from source: magic vars
34139 1726867641.08397: variable 'ansible_version' from source: magic vars
34139 1726867641.08397: variable 'ansible_check_mode' from source: magic vars
34139 1726867641.08398: variable 'ansible_diff_mode' from source: magic vars
34139 1726867641.08399: variable 'ansible_forks' from source: magic vars
34139 1726867641.08399: variable 'ansible_inventory_sources' from source: magic vars
34139 1726867641.08400: variable 'ansible_skip_tags' from source: magic vars
34139 1726867641.08400: variable 'ansible_limit' from source: magic vars
34139 1726867641.08401: variable 'ansible_run_tags' from source: magic vars
34139 1726867641.08402: variable 'ansible_verbosity' from source: magic vars
34139 1726867641.08432: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
34139 1726867641.08996: in VariableManager get_vars()
34139 1726867641.09013: done with get_vars()
34139 1726867641.09140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
34139 1726867641.09338: in VariableManager get_vars()
34139 1726867641.09350: done with get_vars()
34139 1726867641.09355: variable 'omit' from source: magic vars
34139 1726867641.09379: variable 'omit' from source: magic vars
34139 1726867641.09414: in VariableManager get_vars()
34139 1726867641.09425: done with get_vars()
34139 1726867641.09468: in VariableManager get_vars()
34139 1726867641.09485: done with get_vars()
34139 1726867641.09522: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34139 1726867641.09747: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34139 1726867641.09882: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34139 1726867641.10555: in VariableManager get_vars()
34139 1726867641.10579: done with get_vars()
34139 1726867641.10993: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
34139 1726867641.11139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34139 1726867641.12861: in VariableManager get_vars()
34139 1726867641.12881: done with get_vars()
34139 1726867641.12886: variable 'omit' from source: magic vars
34139 1726867641.12897: variable 'omit' from source: magic vars
34139 1726867641.12930: in VariableManager get_vars()
34139 1726867641.12959: done with get_vars()
34139 1726867641.12986: in VariableManager get_vars()
34139 1726867641.13002: done with get_vars()
34139 1726867641.13032: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34139 1726867641.13147: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34139 1726867641.13230: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34139 1726867641.15249: in VariableManager get_vars()
34139 1726867641.15271: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34139 1726867641.17331: in VariableManager get_vars()
34139 1726867641.17350: done with get_vars()
34139 1726867641.17354: variable 'omit' from source: magic vars
34139 1726867641.17364: variable 'omit' from source: magic vars
34139 1726867641.17394: in VariableManager get_vars()
34139 1726867641.17418: done with get_vars()
34139 1726867641.17439: in VariableManager get_vars()
34139 1726867641.17458: done with get_vars()
34139 1726867641.17488: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34139 1726867641.17638: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34139 1726867641.17719: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34139 1726867641.18131: in VariableManager get_vars()
34139 1726867641.18153: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34139 1726867641.20130: in VariableManager get_vars()
34139 1726867641.20153: done with get_vars()
34139 1726867641.20157: variable 'omit' from source: magic vars
34139 1726867641.20182: variable 'omit' from source: magic vars
34139 1726867641.20223: in VariableManager get_vars()
34139 1726867641.20241: done with get_vars()
34139 1726867641.20260: in VariableManager get_vars()
34139 1726867641.20280: done with get_vars()
34139 1726867641.20305: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34139 1726867641.20446: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34139 1726867641.20524: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34139 1726867641.20925: in VariableManager get_vars()
34139 1726867641.20950: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34139 1726867641.22966: in VariableManager get_vars()
34139 1726867641.22996: done with get_vars()
34139 1726867641.23038: in VariableManager get_vars()
34139 1726867641.23061: done with get_vars()
34139 1726867641.23128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
34139 1726867641.23147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
34139 1726867641.23426: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
34139 1726867641.23587: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
34139 1726867641.23590: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
34139 1726867641.23623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
34139 1726867641.23656: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
34139 1726867641.23828: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
34139 1726867641.23891: Loaded config def from plugin (callback/default)
34139 1726867641.23894: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34139 1726867641.25116: Loaded config def from plugin (callback/junit)
34139 1726867641.25119: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34139 1726867641.25162: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
34139 1726867641.25236: Loaded config def from plugin (callback/minimal)
34139 1726867641.25239: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34139 1726867641.25280: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34139 1726867641.25346: Loaded config def from plugin (callback/tree)
34139 1726867641.25349: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
34139 1726867641.25481: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
34139 1726867641.25484: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34139 1726867641.25519: in VariableManager get_vars()
34139 1726867641.25533: done with get_vars()
34139 1726867641.25539: in VariableManager get_vars()
34139 1726867641.25548: done with get_vars()
34139 1726867641.25552: variable 'omit' from source: magic vars
34139 1726867641.25589: in VariableManager get_vars()
34139 1726867641.25604: done with get_vars()
34139 1726867641.25630: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
34139 1726867641.26190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
34139 1726867641.26262: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
34139 1726867641.26300: getting the remaining hosts for this loop
34139 1726867641.26302: done getting the remaining hosts for this loop
34139 1726867641.26304: getting the next task for host managed_node1
34139 1726867641.26308: done getting next task for host managed_node1
34139 1726867641.26312: ^ task is: TASK: Gathering Facts
34139 1726867641.26314: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867641.26316: getting variables
34139 1726867641.26317: in VariableManager get_vars()
34139 1726867641.26326: Calling all_inventory to load vars for managed_node1
34139 1726867641.26329: Calling groups_inventory to load vars for managed_node1
34139 1726867641.26331: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867641.26342: Calling all_plugins_play to load vars for managed_node1
34139 1726867641.26353: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867641.26357: Calling groups_plugins_play to load vars for managed_node1
34139 1726867641.26398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867641.26454: done with get_vars()
34139 1726867641.26461: done getting variables
34139 1726867641.26551: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024  17:27:21 -0400 (0:00:00.011)       0:00:00.011 ******
34139 1726867641.26571: entering _queue_task() for managed_node1/gather_facts
34139 1726867641.26573: Creating lock for gather_facts
34139 1726867641.26949: worker is 1 (out of 1 available)
34139 1726867641.26961: exiting _queue_task() for managed_node1/gather_facts
34139 1726867641.26974: done queuing things up, now waiting for results queue to drain
34139 1726867641.26976: waiting for pending results...
34139 1726867641.27217: running TaskExecutor() for managed_node1/TASK: Gathering Facts
34139 1726867641.27328: in run() - task 0affcac9-a3a5-c103-b8fd-000000000147
34139 1726867641.27349: variable 'ansible_search_path' from source: unknown
34139 1726867641.27425: calling self._execute()
34139 1726867641.27465: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867641.27476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867641.27493: variable 'omit' from source: magic vars
34139 1726867641.27595: variable 'omit' from source: magic vars
34139 1726867641.27627: variable 'omit' from source: magic vars
34139 1726867641.27750: variable 'omit' from source: magic vars
34139 1726867641.27753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
34139 1726867641.27763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
34139 1726867641.27790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
34139 1726867641.27816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34139 1726867641.27830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34139 1726867641.27870: variable 'inventory_hostname' from source: host vars for 'managed_node1'
34139 1726867641.27880: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867641.27888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867641.27994: Set connection var ansible_timeout to 10
34139 1726867641.28005: Set connection var ansible_shell_type to sh
34139 1726867641.28015: Set connection var ansible_shell_executable to /bin/sh
34139 1726867641.28025: Set connection var ansible_pipelining to False
34139 1726867641.28036: Set connection var ansible_connection to ssh
34139 1726867641.28045: Set connection var ansible_module_compression to ZIP_DEFLATED
34139 1726867641.28076: variable 'ansible_shell_executable' from source: unknown
34139 1726867641.28182: variable 'ansible_connection' from source: unknown
34139 1726867641.28186: variable 'ansible_module_compression' from source: unknown
34139 1726867641.28188: variable 'ansible_shell_type' from source: unknown
34139 1726867641.28190: variable 'ansible_shell_executable' from source: unknown
34139 1726867641.28193: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867641.28195: variable 'ansible_pipelining' from source: unknown
34139 1726867641.28197: variable 'ansible_timeout' from source: unknown
34139 1726867641.28199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867641.28311: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
34139 1726867641.28330: variable 'omit' from source: magic vars
34139 1726867641.28381: starting attempt loop
34139 1726867641.28384: running the handler
34139 1726867641.28387: variable 'ansible_facts' from source: unknown
34139 1726867641.28389: _low_level_execute_command(): starting
34139 1726867641.28391: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
34139 1726867641.29114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34139 1726867641.29131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34139 1726867641.29146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34139 1726867641.29163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34139 1726867641.29203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34139 1726867641.29216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<<
34139 1726867641.29228: stderr chunk (state=3): >>>debug2: match found <<<
34139 1726867641.29294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867641.29326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
34139 1726867641.29342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867641.29357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867641.29447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867641.31125: stdout chunk (state=3): >>>/root <<<
34139 1726867641.31275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867641.31280: stdout chunk (state=3): >>><<<
34139 1726867641.31282: stderr chunk (state=3): >>><<<
34139 1726867641.31386: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34139 1726867641.31390: _low_level_execute_command(): starting
34139 1726867641.31394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773 `" && echo ansible-tmp-1726867641.3130667-34167-193049711464773="` echo /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773 `" ) && sleep 0'
34139 1726867641.32072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867641.32163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867641.32200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867641.34080: stdout chunk (state=3): >>>ansible-tmp-1726867641.3130667-34167-193049711464773=/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773 <<<
34139 1726867641.34190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867641.34219: stderr chunk (state=3): >>><<<
34139 1726867641.34223: stdout chunk (state=3): >>><<<
34139 1726867641.34238: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867641.3130667-34167-193049711464773=/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867641.34265: variable 'ansible_module_compression' from source: unknown 34139 1726867641.34308: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34139 1726867641.34314: ANSIBALLZ: Acquiring lock 34139 1726867641.34317: ANSIBALLZ: Lock acquired: 140192904007840 34139 1726867641.34324: ANSIBALLZ: Creating module 34139 1726867641.54833: ANSIBALLZ: Writing module into payload 34139 1726867641.54923: ANSIBALLZ: Writing module 34139 1726867641.54939: ANSIBALLZ: Renaming module 34139 1726867641.54944: ANSIBALLZ: Done creating module 34139 1726867641.54971: variable 'ansible_facts' from source: unknown 34139 1726867641.54976: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34139 1726867641.54988: _low_level_execute_command(): starting 34139 1726867641.54996: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34139 1726867641.55417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867641.55421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.55443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.55483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867641.55496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867641.55556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867641.57243: stdout chunk (state=3): >>>PLATFORM <<< 34139 1726867641.57316: stdout chunk (state=3): >>>Linux <<< 34139 1726867641.57344: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34139 1726867641.57526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867641.57529: stdout chunk (state=3): >>><<< 34139 1726867641.57531: stderr chunk (state=3): >>><<< 34139 1726867641.57668: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867641.57673 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34139 1726867641.57676: _low_level_execute_command(): starting 34139 1726867641.57681: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34139 1726867641.57820: Sending initial data 34139 1726867641.57823: Sent initial data (1181 bytes) 34139 1726867641.58254: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34139 1726867641.58352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.58391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867641.58407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867641.58430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867641.58514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867641.61959: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 34139 1726867641.62344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867641.62381: stderr chunk (state=3): >>><<< 34139 1726867641.62395: stdout chunk (state=3): >>><<< 34139 1726867641.62583: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867641.62586: variable 'ansible_facts' from source: unknown 34139 1726867641.62589: variable 'ansible_facts' from source: unknown 34139 1726867641.62591: variable 'ansible_module_compression' from source: unknown 34139 1726867641.62594: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34139vobchn_u/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34139 1726867641.62601: variable 'ansible_facts' from source: unknown 34139 1726867641.62813: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py 34139 1726867641.63007: Sending initial data 34139 1726867641.63013: Sent initial data (154 bytes) 34139 1726867641.63606: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34139 1726867641.63625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867641.63696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.63752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867641.63767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867641.63805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867641.63873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867641.65497: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34139 1726867641.65534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34139 1726867641.65620: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34139vobchn_u/tmpkfddvsjr /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py <<< 34139 1726867641.65624: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py" <<< 34139 1726867641.65670: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34139vobchn_u/tmpkfddvsjr" to remote "/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py" <<< 34139 1726867641.67252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867641.67297: stderr chunk (state=3): >>><<< 34139 1726867641.67301: stdout chunk (state=3): >>><<< 34139 1726867641.67303: done transferring module to remote 34139 1726867641.67321: _low_level_execute_command(): starting 34139 1726867641.67324: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/ 
/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py && sleep 0' 34139 1726867641.67749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867641.67752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.67755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 34139 1726867641.67757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34139 1726867641.67763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.67805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867641.67809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867641.67860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867641.69647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867641.69682: stderr chunk (state=3): >>><<< 34139 1726867641.69685: stdout chunk (state=3): >>><<< 34139 1726867641.69704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867641.69707: _low_level_execute_command(): starting 34139 1726867641.69709: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/AnsiballZ_setup.py && sleep 0' 34139 1726867641.70116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867641.70119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 34139 1726867641.70122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 34139 1726867641.70124: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867641.70126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867641.70169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867641.70173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867641.70226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867641.72388: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34139 1726867641.72420: stdout chunk (state=3): >>>import _imp # builtin <<< 34139 1726867641.72446: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34139 1726867641.72518: stdout chunk (state=3): >>>import '_io' # <<< 34139 1726867641.72540: stdout chunk (state=3): >>>import 'marshal' # <<< 34139 1726867641.72554: stdout chunk (state=3): >>>import 'posix' # <<< 34139 1726867641.72606: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34139 1726867641.72619: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34139 1726867641.72684: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 34139 1726867641.72698: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 34139 1726867641.72722: stdout chunk (state=3): >>>import 'codecs' # <<< 34139 1726867641.72792: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34139 1726867641.72833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95be184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bde7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34139 1726867641.72838: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95be1aa50> <<< 34139 1726867641.72881: stdout chunk (state=3): >>>import '_signal' # <<< 34139 1726867641.72899: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 34139 1726867641.72904: stdout chunk (state=3): >>>import 'io' # <<< 34139 1726867641.72940: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34139 1726867641.73024: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34139 1726867641.73056: stdout chunk (state=3): >>>import 'genericpath' # <<< 34139 1726867641.73061: stdout chunk (state=3): >>>import 'posixpath' # <<< 34139 1726867641.73096: stdout chunk (state=3): >>>import 'os' # <<< 34139 1726867641.73101: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34139 1726867641.73128: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34139 1726867641.73140: stdout chunk (state=3): >>>Adding directory: 
'/usr/lib/python3.12/site-packages' <<< 34139 1726867641.73144: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34139 1726867641.73176: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34139 1726867641.73207: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc2d130> <<< 34139 1726867641.73262: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34139 1726867641.73273: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.73279: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc2dfa0> <<< 34139 1726867641.73307: stdout chunk (state=3): >>>import 'site' # <<< 34139 1726867641.73336: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34139 1726867641.73712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34139 1726867641.73718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34139 1726867641.73751: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34139 1726867641.73754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.73782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34139 1726867641.73815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34139 1726867641.73835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34139 1726867641.73862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34139 1726867641.73881: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc6be00> <<< 34139 1726867641.73891: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34139 1726867641.73912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34139 1726867641.73933: stdout chunk (state=3): >>>import '_operator' # <<< 34139 1726867641.73939: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc6bec0> <<< 34139 1726867641.73955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34139 1726867641.73987: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34139 1726867641.74007: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34139 1726867641.74059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.74071: stdout chunk (state=3): >>>import 'itertools' # <<< 34139 1726867641.74104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 34139 1726867641.74111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bca37d0> <<< 34139 1726867641.74131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 34139 1726867641.74138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34139 1726867641.74143: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bca3e60> <<< 34139 1726867641.74163: stdout chunk (state=3): >>>import '_collections' # <<< 34139 1726867641.74208: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc83ad0> <<< 34139 1726867641.74221: stdout chunk (state=3): >>>import '_functools' # <<< 34139 1726867641.74246: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc811f0> <<< 34139 1726867641.74334: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc68fb0> <<< 34139 1726867641.74359: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34139 1726867641.74384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34139 1726867641.74389: stdout chunk (state=3): >>>import '_sre' # <<< 34139 1726867641.74419: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34139 1726867641.74444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34139 1726867641.74467: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34139 1726867641.74515: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc3770> <<< 34139 1726867641.74533: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc2390> <<< 34139 1726867641.74559: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc0bc0> <<< 34139 1726867641.74619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34139 1726867641.74640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf8800> import 're' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc68230> <<< 34139 1726867641.74682: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34139 1726867641.74694: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bcf8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf8b60> <<< 34139 1726867641.74731: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.74744: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bcf8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc66d50> <<< 34139 1726867641.74780: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.74795: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34139 1726867641.74841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34139 1726867641.74851: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb95bcf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf9250> import 'importlib.machinery' # <<< 34139 1726867641.74893: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34139 1726867641.74931: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfa480> import 'importlib.util' # import 'runpy' # <<< 34139 1726867641.74953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34139 1726867641.74986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34139 1726867641.75010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd106b0> <<< 34139 1726867641.75052: stdout chunk (state=3): >>>import 'errno' # <<< 34139 1726867641.75079: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd11d90> <<< 34139 1726867641.75111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34139 1726867641.75114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34139 1726867641.75142: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd12c30> <<< 34139 1726867641.75190: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.75217: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd12180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34139 1726867641.75230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34139 1726867641.75280: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.75291: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd13440> <<< 34139 1726867641.75330: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfa4e0> <<< 34139 1726867641.75345: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34139 1726867641.75379: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34139 
1726867641.75399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34139 1726867641.75450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34139 1726867641.75456: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba07bc0> <<< 34139 1726867641.75515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34139 1726867641.75519: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.75543: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba306e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba30620> <<< 34139 1726867641.75568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34139 1726867641.75580: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34139 1726867641.75637: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.75765: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba30fe0> <<< 34139 1726867641.75892: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba31970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30890> <<< 34139 1726867641.75920: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba05d60> <<< 34139 1726867641.75932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34139 1726867641.75976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34139 1726867641.75995: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34139 1726867641.76016: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba32cf0> <<< 34139 1726867641.76027: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30e60> <<< 34139 1726867641.76030: stdout chunk 
(state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfabd0> <<< 34139 1726867641.76070: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34139 1726867641.76123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.76223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34139 1726867641.76391: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba5f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba833e0> <<< 34139 1726867641.76542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34139 
1726867641.76545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34139 1726867641.76573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34139 1726867641.76612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34139 1726867641.76728: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae28d0> <<< 34139 1726867641.76769: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae02c0> <<< 34139 1726867641.76806: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95baad190> <<< 34139 1726867641.76893: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b8f11f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba821e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba33bf0> <<< 34139 1726867641.77042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34139 1726867641.77053: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb95ba828a0> <<< 34139 1726867641.77294: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_5wgx35q_/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 34139 1726867641.77424: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 
1726867641.77467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34139 1726867641.77499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34139 1726867641.77575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34139 1726867641.77604: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 34139 1726867641.77607: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b952f00> import '_typing' # <<< 34139 1726867641.77802: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b931df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b930f50> <<< 34139 1726867641.77806: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.77834: stdout chunk (state=3): >>>import 'ansible' # <<< 34139 1726867641.77856: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.77859: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.77889: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.77893: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34139 1726867641.77910: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.79322: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.80569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b950dd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34139 1726867641.80609: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98a780> <<< 34139 1726867641.80645: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98a510> <<< 34139 1726867641.80658: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b989e20> <<< 34139 1726867641.80674: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34139 1726867641.80711: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98a840> <<< 34139 1726867641.80721: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b953b90> import 'atexit' # <<< 
34139 1726867641.80750: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98b4a0> <<< 34139 1726867641.80785: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.80801: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98b650> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34139 1726867641.80857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34139 1726867641.80867: stdout chunk (state=3): >>>import '_locale' # <<< 34139 1726867641.80932: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98bb90> <<< 34139 1726867641.80946: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34139 1726867641.80969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34139 1726867641.81012: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b32d970> <<< 34139 1726867641.81041: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b32f590> <<< 34139 1726867641.81061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34139 1726867641.81085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34139 1726867641.81123: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b32ff20> <<< 34139 1726867641.81137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34139 1726867641.81169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34139 1726867641.81184: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b335100> <<< 34139 1726867641.81201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34139 1726867641.81236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34139 1726867641.81256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34139 1726867641.81312: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b337bc0> <<< 34139 1726867641.81361: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b932ff0> <<< 34139 1726867641.81371: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b335e80> <<< 34139 1726867641.81395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34139 1726867641.81433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34139 1726867641.81451: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34139 1726867641.81469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34139 1726867641.81581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34139 1726867641.81613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33bb90> <<< 34139 1726867641.81638: stdout chunk (state=3): >>>import '_tokenize' # <<< 34139 1726867641.81708: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a3c0> <<< 34139 1726867641.81735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34139 1726867641.81738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34139 
1726867641.81805: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a930> <<< 34139 1726867641.81836: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b336390> <<< 34139 1726867641.81871: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b37fe90> <<< 34139 1726867641.81902: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b37f890> <<< 34139 1726867641.81937: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34139 1726867641.81943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34139 1726867641.81963: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34139 1726867641.82005: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b381a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b381820> <<< 34139 1726867641.82021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34139 1726867641.82054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34139 1726867641.82110: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.82130: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b383fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b382150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34139 1726867641.82183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.82218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34139 1726867641.82223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34139 1726867641.82265: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b387680> <<< 34139 1726867641.82388: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b384140> <<< 34139 1726867641.82463: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b388a10> <<< 34139 1726867641.82485: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b3888c0> <<< 34139 1726867641.82527: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b3880b0> <<< 34139 1726867641.82555: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b3801d0> <<< 34139 1726867641.82575: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34139 1726867641.82602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34139 1726867641.82612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34139 1726867641.82639: stdout chunk (state=3): >>># extension module '_socket' 
loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.82673: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b214170> <<< 34139 1726867641.82825: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.82838: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b2153a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b38a8d0> <<< 34139 1726867641.82880: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b38bc50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b38a4e0> <<< 34139 1726867641.82916: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.82931: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34139 1726867641.83027: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.83122: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.83127: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34139 1726867641.83170: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34139 1726867641.83173: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.83366: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.83419: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.83981: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.84614: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34139 1726867641.84641: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.84649: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b2194c0> <<< 34139 1726867641.84728: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34139 1726867641.84748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21a1e0> <<< 34139 1726867641.84764: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2155b0> <<< 34139 1726867641.84806: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34139 1726867641.84838: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34139 1726867641.84866: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.84869: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34139 1726867641.85007: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.85183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34139 1726867641.85194: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21a180> # zipimport: zlib available <<< 34139 1726867641.85649: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86093: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86168: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86255: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34139 1726867641.86258: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86294: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86330: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34139 1726867641.86341: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86402: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86498: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34139 1726867641.86530: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34139 1726867641.86578: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86630: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34139 
1726867641.86634: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.86926: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34139 1726867641.87160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34139 1726867641.87164: stdout chunk (state=3): >>>import '_ast' # <<< 34139 1726867641.87244: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21b2f0> <<< 34139 1726867641.87247: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87311: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87387: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34139 1726867641.87420: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 34139 1726867641.87433: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87468: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87515: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 34139 1726867641.87569: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87671: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.87736: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34139 1726867641.87784: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.87874: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b225d60> <<< 34139 1726867641.87905: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2215b0> <<< 34139 1726867641.87938: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34139 1726867641.87957: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88017: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88094: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88125: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88159: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34139 1726867641.88181: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34139 1726867641.88212: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34139 1726867641.88229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34139 1726867641.88315: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34139 1726867641.88326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34139 1726867641.88369: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b30e780> <<< 34139 1726867641.88420: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b3fa480> <<< 34139 1726867641.88504: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b225dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b388e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34139 1726867641.88520: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88557: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88585: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34139 1726867641.88645: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34139 1726867641.88648: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.88680: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 34139 1726867641.88735: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89007: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34139 1726867641.89080: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34139 1726867641.89154: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89174: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89201: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34139 1726867641.89220: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89391: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89565: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89602: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.89661: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867641.89695: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34139 1726867641.89716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34139 1726867641.89729: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34139 1726867641.89749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34139 1726867641.89774: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b9e50> <<< 34139 1726867641.89802: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34139 1726867641.89823: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34139 1726867641.89872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34139 1726867641.89901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34139 1726867641.89924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 34139 1726867641.89947: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae3bdd0> <<< 34139 1726867641.89969: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae403e0> <<< 34139 1726867641.90023: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2a2c30> <<< 34139 1726867641.90036: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2ba9c0> <<< 34139 1726867641.90068: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b8530> <<< 34139 1726867641.90093: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b8f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34139 1726867641.90148: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34139 1726867641.90172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34139 1726867641.90203: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34139 1726867641.90248: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae430b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae42990> <<< 34139 1726867641.90282: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.90293: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae42b40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae41e20> <<< 34139 1726867641.90322: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34139 1726867641.90447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34139 1726867641.90451: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb95ae431a0> <<< 34139 1726867641.90469: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34139 1726867641.90495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34139 1726867641.90527: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aeadca0> <<< 34139 1726867641.90555: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae43c80> <<< 34139 1726867641.90597: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b81d0> import 'ansible.module_utils.facts.timeout' # <<< 34139 1726867641.90624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 34139 1726867641.90634: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34139 1726867641.90661: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.90709: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.90771: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34139 1726867641.90791: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.90834: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.90883: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34139 1726867641.90919: stdout 
chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34139 1726867641.90939: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.90964: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 34139 1726867641.91057: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34139 1726867641.91164: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34139 1726867641.91221: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91271: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91332: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91388: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 34139 1726867641.91468: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.91943: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 34139 1726867641.92445: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92495: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92531: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92584: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 
'ansible.module_utils.facts.system.date_time' # <<< 34139 1726867641.92590: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92614: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92647: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 34139 1726867641.92673: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92704: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34139 1726867641.92800: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92819: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92846: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34139 1726867641.92861: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92910: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.92914: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34139 1726867641.92934: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93018: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34139 1726867641.93106: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aead9d0> <<< 34139 1726867641.93144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34139 1726867641.93159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34139 1726867641.93288: 
stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aeae990> import 'ansible.module_utils.facts.system.local' # <<< 34139 1726867641.93301: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93369: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 34139 1726867641.93524: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34139 1726867641.93630: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93884: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.93910: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34139 1726867641.93966: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.94023: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aeddfa0> <<< 34139 1726867641.94208: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aecdd30> import 'ansible.module_utils.facts.system.python' # <<< 34139 1726867641.94229: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94279: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94332: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.selinux' # <<< 34139 1726867641.94355: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94421: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94508: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94616: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94763: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 34139 1726867641.94790: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94816: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34139 1726867641.94883: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94908: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.94962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 34139 1726867641.94986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34139 1726867641.95021: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867641.95044: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aef1820> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aecf0b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 34139 1726867641.95061: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.hardware' # <<< 34139 1726867641.95082: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95108: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 34139 1726867641.95167: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95315: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34139 1726867641.95481: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95575: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95684: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95722: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95780: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 34139 1726867641.95819: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95822: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95833: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.95982: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.96131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34139 1726867641.96143: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.96259: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.96381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34139 1726867641.96395: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.96423: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34139 1726867641.96460: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.97019: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.97705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 34139 1726867641.97737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34139 1726867641.97758: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.97847: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.97947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34139 1726867641.97965: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98105: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34139 1726867641.98298: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34139 1726867641.98314: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98405: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34139 1726867641.98411: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98498: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98599: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.98800: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 34139 
1726867641.99029: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99053: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 34139 1726867641.99135: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99158: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34139 1726867641.99230: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34139 1726867641.99324: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99352: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99375: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 34139 1726867641.99389: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99422: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 34139 1726867641.99551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34139 1726867641.99627: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867641.99882: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00164: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34139 1726867642.00170: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00220: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00283: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.iscsi' # <<< 34139 1726867642.00311: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00331: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34139 1726867642.00368: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00423: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34139 1726867642.00512: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 34139 1726867642.00598: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34139 1726867642.00702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.00728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34139 1726867642.00766: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 34139 1726867642.00842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00861: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00904: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.00953: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01022: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 
'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34139 1726867642.01125: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01166: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 34139 1726867642.01411: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34139 1726867642.01620: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01662: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34139 1726867642.01759: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01817: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34139 1726867642.01821: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01900: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.01995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34139 1726867642.01998: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.02075: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.02165: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34139 1726867642.02174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34139 1726867642.02243: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.02922: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 34139 1726867642.02937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34139 1726867642.02955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34139 1726867642.02965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34139 1726867642.03002: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.03015: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95acf1e50> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95acf3ec0> <<< 34139 1726867642.03058: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aceb2c0> <<< 34139 1726867642.16056: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34139 1726867642.16061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 34139 1726867642.16085: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad38530> <<< 34139 1726867642.16113: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 34139 1726867642.16134: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad38950> <<< 34139 1726867642.16183: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 34139 1726867642.16209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.16239: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad39d60> <<< 34139 1726867642.16275: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad39880> <<< 34139 1726867642.16515: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34139 1726867642.42004: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": 
"08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 887, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789585408, "block_size": 4096, "block_total": 65519099, "block_available": 63913473, 
"block_used": 1605626, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "22:17:5e:4f:0a:7e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "27", "second": "22", "epoch": "1726867642", "epoch_int": "1726867642", "date": "2024-09-20", "time": "17:27:22", "iso8601_micro": "2024-09-20T21:27:22.411669Z", "iso8601": "2024-09-20T21:27:22Z", "iso8601_basic": "20240920T172722411669", "iso8601_basic_short": "20240920T172722", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.4638671875, "5m": 0.51708984375, "15m": 0.30712890625}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34139 1726867642.42261: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34139 1726867642.42269: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # 
cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum<<< 34139 1726867642.42424: stdout chunk (state=3): >>> # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib <<< 34139 1726867642.42439: stdout chunk (state=3): >>># cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 34139 1726867642.42445: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string <<< 34139 1726867642.42449: stdout chunk (state=3): >>># cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 34139 1726867642.42453: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 34139 1726867642.42456: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34139 1726867642.42583: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] 
removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] 
removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34139 1726867642.42840: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34139 1726867642.43020: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 34139 1726867642.43024: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 34139 1726867642.43083: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34139 1726867642.43092: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 34139 1726867642.43132: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34139 1726867642.43166: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # 
destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 34139 1726867642.43213: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 34139 1726867642.43345: stdout chunk (state=3): >>># destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 34139 1726867642.43349: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 34139 1726867642.43412: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 34139 1726867642.43465: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # 
cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 34139 1726867642.43566: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 34139 1726867642.43570: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux <<< 34139 1726867642.43592: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime 
<<< 34139 1726867642.43748: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34139 1726867642.43806: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 34139 1726867642.43814: stdout chunk (state=3): >>># destroy _tokenize <<< 34139 1726867642.43868: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34139 1726867642.43934: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections <<< 34139 1726867642.43994: stdout chunk (state=3): >>># destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 34139 1726867642.44095: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34139 1726867642.44412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867642.44549: stderr chunk (state=3): >>>Shared connection to 
10.31.12.57 closed. <<< 34139 1726867642.44574: stderr chunk (state=3): >>><<< 34139 1726867642.44579: stdout chunk (state=3): >>><<< 34139 1726867642.45040: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95be184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bde7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95be1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc6be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc6bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bca37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bca3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc83ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc68fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc2390> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc82090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcc0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc68230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bcf8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bcf8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bc66d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcf9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd11d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd12c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd13290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd12180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95bd13d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bd13440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba07bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba306e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba30620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba30fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ba31970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba05d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba32cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba30e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bcfabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba5f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba833e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae28d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95bae02c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95baad190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b8f11f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba821e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ba33bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb95ba828a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_5wgx35q_/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb95b952f00> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b931df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b930f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b950dd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98a780> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98a510> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b989e20> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98a840> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b953b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98b4a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b98b650> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b98bb90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b32d970> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b32f590> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb95b32ff20> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b335100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b337bc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b932ff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b335e80> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33bb90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a3c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b33a930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b336390> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b37fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b37f890> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b381a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b381820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b383fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b382150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b387680> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b384140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b388a10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b3888c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b3880b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b3801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b214170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b2153a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b38a8d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b38bc50> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b38a4e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b2194c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21a1e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2155b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21a180> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b21b2f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95b225d60> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2215b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b30e780> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b3fa480> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b225dc0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b388e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b9e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae3bdd0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae403e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2a2c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2ba9c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b8530> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b8f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae430b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae42990> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95ae42b40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae41e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae431a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aeadca0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ae43c80> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95b2b81d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aead9d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aeae990> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aeddfa0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aecdd30> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95aef1820> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aecf0b0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb95acf1e50> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95acf3ec0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95aceb2c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad38530> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad38950> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad39d60> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb95ad39880> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 887, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789585408, "block_size": 4096, "block_total": 65519099, "block_available": 63913473, "block_used": 1605626, "inode_total": 131070960, "inode_available": 131029044, "inode_used": 41916, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "22:17:5e:4f:0a:7e", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", 
"tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.72", "10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "27", "second": "22", "epoch": "1726867642", "epoch_int": "1726867642", "date": "2024-09-20", "time": "17:27:22", "iso8601_micro": "2024-09-20T21:27:22.411669Z", "iso8601": "2024-09-20T21:27:22Z", "iso8601_basic": "20240920T172722411669", "iso8601_basic_short": "20240920T172722", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.4638671875, "5m": 0.51708984375, "15m": 0.30712890625}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # 
cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian 
# cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data:
[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 34139 1726867642.46733: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34139 1726867642.46736: _low_level_execute_command(): starting 34139 1726867642.46738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867641.3130667-34167-193049711464773/ > /dev/null 2>&1 && sleep 0' 34139 1726867642.47027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34139 1726867642.47064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867642.47082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34139 1726867642.47122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867642.47195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867642.47250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867642.47254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867642.47304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867642.49170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867642.49174: stdout chunk (state=3): >>><<< 34139 1726867642.49176: stderr chunk (state=3): >>><<< 34139 1726867642.49196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
34139 1726867642.49382: handler run complete
34139 1726867642.49385: variable 'ansible_facts' from source: unknown
34139 1726867642.49464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.49821: variable 'ansible_facts' from source: unknown
34139 1726867642.49914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.50063: attempt loop complete, returning result
34139 1726867642.50072: _execute() done
34139 1726867642.50084: dumping result to json
34139 1726867642.50125: done dumping result, returning
34139 1726867642.50138: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-c103-b8fd-000000000147]
34139 1726867642.50146: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000147
ok: [managed_node1]
34139 1726867642.50983: no more pending results, returning what we have
34139 1726867642.50987: results queue empty
34139 1726867642.50988: checking for any_errors_fatal
34139 1726867642.50989: done checking for any_errors_fatal
34139 1726867642.50990: checking for max_fail_percentage
34139 1726867642.50991: done checking for max_fail_percentage
34139 1726867642.50992: checking to see if all hosts have failed and the running result is not ok
34139 1726867642.50993: done checking to see if all hosts have failed
34139 1726867642.50993: getting the remaining hosts for this loop
34139 1726867642.50995: done getting the remaining hosts for this loop
34139 1726867642.50998: getting the next task for host managed_node1
34139 1726867642.51005: done getting next task for host managed_node1
34139 1726867642.51006: ^ task is: TASK: meta (flush_handlers)
34139 1726867642.51010: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867642.51014: getting variables
34139 1726867642.51016: in VariableManager get_vars()
34139 1726867642.51036: Calling all_inventory to load vars for managed_node1
34139 1726867642.51039: Calling groups_inventory to load vars for managed_node1
34139 1726867642.51042: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867642.51048: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000147
34139 1726867642.51050: WORKER PROCESS EXITING
34139 1726867642.51059: Calling all_plugins_play to load vars for managed_node1
34139 1726867642.51061: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867642.51064: Calling groups_plugins_play to load vars for managed_node1
34139 1726867642.51255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.51482: done with get_vars()
34139 1726867642.51492: done getting variables
34139 1726867642.51557: in VariableManager get_vars()
34139 1726867642.51565: Calling all_inventory to load vars for managed_node1
34139 1726867642.51567: Calling groups_inventory to load vars for managed_node1
34139 1726867642.51570: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867642.51574: Calling all_plugins_play to load vars for managed_node1
34139 1726867642.51576: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867642.51581: Calling groups_plugins_play to load vars for managed_node1
34139 1726867642.51733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.51940: done with get_vars()
34139 1726867642.51953: done queuing things up, now waiting for results queue to drain
34139 1726867642.51955: results queue empty
34139 1726867642.51956: checking for any_errors_fatal
34139 1726867642.51958: done checking for any_errors_fatal
34139 1726867642.51959: checking for max_fail_percentage
34139 1726867642.51960: done checking for max_fail_percentage
34139 1726867642.51966: checking to see if all hosts have failed and the running result is not ok
34139 1726867642.51967: done checking to see if all hosts have failed
34139 1726867642.51968: getting the remaining hosts for this loop
34139 1726867642.51968: done getting the remaining hosts for this loop
34139 1726867642.51971: getting the next task for host managed_node1
34139 1726867642.51975: done getting next task for host managed_node1
34139 1726867642.51979: ^ task is: TASK: Include the task 'el_repo_setup.yml'
34139 1726867642.51981: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867642.51983: getting variables
34139 1726867642.51984: in VariableManager get_vars()
34139 1726867642.51991: Calling all_inventory to load vars for managed_node1
34139 1726867642.51993: Calling groups_inventory to load vars for managed_node1
34139 1726867642.51995: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867642.52000: Calling all_plugins_play to load vars for managed_node1
34139 1726867642.52002: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867642.52005: Calling groups_plugins_play to load vars for managed_node1
34139 1726867642.52155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.52367: done with get_vars()
34139 1726867642.52374: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11
Friday 20 September 2024 17:27:22 -0400 (0:00:01.258) 0:00:01.270 ******
34139 1726867642.52440: entering _queue_task() for managed_node1/include_tasks
34139 1726867642.52442: Creating lock for include_tasks
34139 1726867642.52744: worker is 1 (out of 1 available)
34139 1726867642.52758: exiting _queue_task() for managed_node1/include_tasks
34139 1726867642.52769: done queuing things up, now waiting for results queue to drain
34139 1726867642.52771: waiting for pending results...
34139 1726867642.53041: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml'
34139 1726867642.53131: in run() - task 0affcac9-a3a5-c103-b8fd-000000000006
34139 1726867642.53155: variable 'ansible_search_path' from source: unknown
34139 1726867642.53195: calling self._execute()
34139 1726867642.53273: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867642.53280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867642.53294: variable 'omit' from source: magic vars
34139 1726867642.53413: _execute() done
34139 1726867642.53416: dumping result to json
34139 1726867642.53419: done dumping result, returning
34139 1726867642.53422: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-c103-b8fd-000000000006]
34139 1726867642.53429: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000006
34139 1726867642.53523: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000006
34139 1726867642.53526: WORKER PROCESS EXITING
34139 1726867642.53570: no more pending results, returning what we have
34139 1726867642.53584: in VariableManager get_vars()
34139 1726867642.53619: Calling all_inventory to load vars for managed_node1
34139 1726867642.53622: Calling groups_inventory to load vars for managed_node1
34139 1726867642.53625: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867642.53638: Calling all_plugins_play to load vars for managed_node1
34139 1726867642.53641: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867642.53645: Calling groups_plugins_play to load vars for managed_node1
34139 1726867642.53984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.54170: done with get_vars()
34139 1726867642.54380: variable 'ansible_search_path' from source: unknown
34139 1726867642.54394: we have included files to process
34139 1726867642.54395: generating all_blocks data
34139 1726867642.54396: done generating all_blocks data
34139 1726867642.54397: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
34139 1726867642.54398: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
34139 1726867642.54401: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml
34139 1726867642.55067: in VariableManager get_vars()
34139 1726867642.55084: done with get_vars()
34139 1726867642.55097: done processing included file
34139 1726867642.55098: iterating over new_blocks loaded from include file
34139 1726867642.55100: in VariableManager get_vars()
34139 1726867642.55111: done with get_vars()
34139 1726867642.55113: filtering new block on tags
34139 1726867642.55126: done filtering new block on tags
34139 1726867642.55129: in VariableManager get_vars()
34139 1726867642.55138: done with get_vars()
34139 1726867642.55140: filtering new block on tags
34139 1726867642.55154: done filtering new block on tags
34139 1726867642.55156: in VariableManager get_vars()
34139 1726867642.55165: done with get_vars()
34139 1726867642.55166: filtering new block on tags
34139 1726867642.55180: done filtering new block on tags
34139 1726867642.55182: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1
34139 1726867642.55187: extending task lists for all hosts with included blocks
34139 1726867642.55234: done extending task lists
34139 1726867642.55236: done processing included files
34139 1726867642.55236: results queue empty
34139 1726867642.55237: checking for any_errors_fatal
34139 1726867642.55238: done checking for any_errors_fatal
34139 1726867642.55239: checking for max_fail_percentage
34139 1726867642.55240: done checking for max_fail_percentage
34139 1726867642.55240: checking to see if all hosts have failed and the running result is not ok
34139 1726867642.55241: done checking to see if all hosts have failed
34139 1726867642.55241: getting the remaining hosts for this loop
34139 1726867642.55242: done getting the remaining hosts for this loop
34139 1726867642.55245: getting the next task for host managed_node1
34139 1726867642.55248: done getting next task for host managed_node1
34139 1726867642.55250: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test
34139 1726867642.55253: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867642.55256: getting variables
34139 1726867642.55257: in VariableManager get_vars()
34139 1726867642.55265: Calling all_inventory to load vars for managed_node1
34139 1726867642.55267: Calling groups_inventory to load vars for managed_node1
34139 1726867642.55269: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867642.55274: Calling all_plugins_play to load vars for managed_node1
34139 1726867642.55276: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867642.55281: Calling groups_plugins_play to load vars for managed_node1
34139 1726867642.55445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867642.55656: done with get_vars()
34139 1726867642.55664: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024 17:27:22 -0400 (0:00:00.032) 0:00:01.303 ******
34139 1726867642.55728: entering _queue_task() for managed_node1/setup
34139 1726867642.55957: worker is 1 (out of 1 available)
34139 1726867642.55967: exiting _queue_task() for managed_node1/setup
34139 1726867642.56182: done queuing things up, now waiting for results queue to drain
34139 1726867642.56184: waiting for pending results...
34139 1726867642.56311: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test
34139 1726867642.56316: in run() - task 0affcac9-a3a5-c103-b8fd-000000000158
34139 1726867642.56318: variable 'ansible_search_path' from source: unknown
34139 1726867642.56325: variable 'ansible_search_path' from source: unknown
34139 1726867642.56362: calling self._execute()
34139 1726867642.56438: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867642.56449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867642.56463: variable 'omit' from source: magic vars
34139 1726867642.56971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
34139 1726867642.59335: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
34139 1726867642.59347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
34139 1726867642.59387: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
34139 1726867642.59428: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
34139 1726867642.59462: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
34139 1726867642.59544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
34139 1726867642.59587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
34139 1726867642.59621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
34139 1726867642.59671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
34139 1726867642.59772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
34139 1726867642.59859: variable 'ansible_facts' from source: unknown
34139 1726867642.59935: variable 'network_test_required_facts' from source: task vars
34139 1726867642.59972: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True
34139 1726867642.59986: variable 'omit' from source: magic vars
34139 1726867642.60026: variable 'omit' from source: magic vars
34139 1726867642.60060: variable 'omit' from source: magic vars
34139 1726867642.60088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
34139 1726867642.60123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
34139 1726867642.60146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
34139 1726867642.60168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34139 1726867642.60211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34139 1726867642.60224: variable 'inventory_hostname' from source: host vars for 'managed_node1'
34139 1726867642.60231: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867642.60317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867642.60338: Set connection var ansible_timeout to 10
34139 1726867642.60349: Set connection var ansible_shell_type to sh
34139 1726867642.60358: Set connection var ansible_shell_executable to /bin/sh
34139 1726867642.60367: Set connection var ansible_pipelining to False
34139 1726867642.60375: Set connection var ansible_connection to ssh
34139 1726867642.60386: Set connection var ansible_module_compression to ZIP_DEFLATED
34139 1726867642.60414: variable 'ansible_shell_executable' from source: unknown
34139 1726867642.60425: variable 'ansible_connection' from source: unknown
34139 1726867642.60432: variable 'ansible_module_compression' from source: unknown
34139 1726867642.60438: variable 'ansible_shell_type' from source: unknown
34139 1726867642.60446: variable 'ansible_shell_executable' from source: unknown
34139 1726867642.60452: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867642.60461: variable 'ansible_pipelining' from source: unknown
34139 1726867642.60468: variable 'ansible_timeout' from source: unknown
34139 1726867642.60475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867642.60607: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
34139 1726867642.60626: variable 'omit' from source: magic vars
34139 1726867642.60765: starting attempt loop
34139 1726867642.60769: running the handler
34139 1726867642.60772: _low_level_execute_command(): starting
34139 1726867642.60774: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
34139 1726867642.61354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34139 1726867642.61370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34139 1726867642.61403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34139 1726867642.61470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867642.61531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
34139 1726867642.61548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867642.61574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867642.61659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867642.63314: stdout chunk (state=3): >>>/root <<<
34139 1726867642.63452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867642.63466: stdout chunk (state=3): >>><<<
34139 1726867642.63489: stderr chunk (state=3): >>><<<
34139 1726867642.63524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34139 1726867642.63621: _low_level_execute_command(): starting
34139 1726867642.63625: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522 `" && echo ansible-tmp-1726867642.635362-34224-172474604501522="` echo /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522 `" ) && sleep 0'
34139 1726867642.64176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34139 1726867642.64195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34139 1726867642.64214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34139 1726867642.64292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867642.64346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
34139 1726867642.64362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867642.64389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867642.64472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867642.66373: stdout chunk (state=3): >>>ansible-tmp-1726867642.635362-34224-172474604501522=/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522 <<<
34139 1726867642.66535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867642.66540: stdout chunk (state=3): >>><<<
34139 1726867642.66543: stderr chunk (state=3): >>><<<
34139 1726867642.66685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867642.635362-34224-172474604501522=/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34139 1726867642.66689: variable 'ansible_module_compression' from source: unknown
34139 1726867642.66691: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34139vobchn_u/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
34139 1726867642.67056: variable 'ansible_facts' from source: unknown
34139 1726867642.67140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py
34139 1726867642.67364: Sending initial data
34139 1726867642.67373: Sent initial data (153 bytes)
34139 1726867642.67851: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34139 1726867642.67865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34139 1726867642.67882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34139 1726867642.67983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867642.68004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867642.68079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867642.69626: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
34139 1726867642.69668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
34139 1726867642.69707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34139vobchn_u/tmpf68opaub /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py <<<
34139 1726867642.69716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py" <<<
34139 1726867642.69780: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34139vobchn_u/tmpf68opaub" to remote "/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py" <<<
34139 1726867642.72298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867642.72307: stdout chunk (state=3): >>><<<
34139 1726867642.72318: stderr chunk (state=3): >>><<<
34139 1726867642.72368: done transferring module to remote
34139 1726867642.72485: _low_level_execute_command(): starting
34139 1726867642.72489: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/ /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py && sleep 0'
34139 1726867642.73571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34139 1726867642.73592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34139 1726867642.73606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34139 1726867642.73630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34139 1726867642.73662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<<
34139 1726867642.73675: stderr chunk (state=3): >>>debug2: match not found <<<
34139 1726867642.73693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867642.73716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
34139 1726867642.73729: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<<
34139 1726867642.73754: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34139 1726867642.73790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867642.73868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
34139 1726867642.73891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34139 1726867642.73918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867642.74029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867642.75863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34139 1726867642.75866: stdout chunk (state=3): >>><<<
34139 1726867642.75868: stderr chunk (state=3): >>><<<
34139 1726867642.75991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34139 1726867642.75995: _low_level_execute_command(): starting
34139 1726867642.75997: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/AnsiballZ_setup.py && sleep 0'
34139 1726867642.77123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34139 1726867642.77169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<<
34139 1726867642.77172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
34139 1726867642.77493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34139 1726867642.79621: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
34139 1726867642.79638: stdout chunk (state=3): >>>import _imp # builtin <<<
34139 1726867642.79675: stdout chunk (state=3): >>>import '_thread' # <<<
34139 1726867642.79681: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<<
34139 1726867642.79738: stdout chunk (state=3): >>>import '_io' # <<<
34139 1726867642.79749: stdout chunk (state=3): >>>import 'marshal' # <<<
34139 1726867642.79846: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<<
34139 1726867642.79849: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<<
34139 1726867642.79941: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<<
34139 1726867642.79974: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
34139 1726867642.80058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38259684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825937b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382596aa50> <<<
34139 1726867642.80078: stdout chunk (state=3): >>>import '_signal' # <<<
34139 1726867642.80095: stdout chunk (state=3): >>>import '_abc' # <<<
34139 1726867642.80163: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<<
34139 1726867642.80239: stdout chunk (state=3): >>>import '_collections_abc' # <<<
34139 1726867642.80265: stdout chunk (state=3): >>>import 'genericpath' # <<<
34139 1726867642.80383: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<<
34139 1726867642.80501: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382571d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382571dfa0> import 'site' # <<<
34139 1726867642.80534: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
34139 1726867642.80901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
34139 1726867642.81019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
34139 1726867642.81026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<<
34139 1726867642.81028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
34139 1726867642.81043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<<
34139 1726867642.81056: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382575bec0> <<<
34139 1726867642.81074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
34139 1726867642.81139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382575bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
34139 1726867642.81162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
34139 1726867642.81252: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches
/usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 34139 1726867642.81283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 34139 1726867642.81290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825793830> <<< 34139 1726867642.81305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34139 1726867642.81361: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825793ec0> import '_collections' # <<< 34139 1726867642.81379: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825773b60> <<< 34139 1726867642.81391: stdout chunk (state=3): >>>import '_functools' # <<< 34139 1726867642.81466: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257712b0> <<< 34139 1726867642.81504: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825759070> <<< 34139 1726867642.81531: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34139 1726867642.81545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34139 1726867642.81562: stdout chunk (state=3): >>>import '_sre' # <<< 34139 1726867642.81714: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # 
code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34139 1726867642.81717: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 34139 1726867642.81719: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825772150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b0bc0> <<< 34139 1726867642.81773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 34139 1726867642.81776: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257582f0> <<< 34139 1726867642.81847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38257e8d40> import 'struct' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e8bf0> <<< 34139 1726867642.81882: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.81960: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38257e8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825756e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.82216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257ea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3825800740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825801e20> <<< 34139 1726867642.82222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34139 1726867642.82233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34139 1726867642.82291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825802cc0> <<< 34139 1726867642.82320: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.82323: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38258032f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825802210> <<< 34139 1726867642.82412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34139 
1726867642.82415: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825803d70> <<< 34139 1726867642.82418: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38258034a0> <<< 34139 1726867642.82471: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257ea4b0> <<< 34139 1726867642.82474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34139 1726867642.82515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34139 1726867642.82518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34139 1726867642.82544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34139 1726867642.82602: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.82614: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382550bc50> <<< 34139 1726867642.82619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 34139 1726867642.82621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34139 1726867642.82637: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825534710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825534470> <<< 34139 1726867642.82657: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825534740> <<< 34139 1726867642.82691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34139 1726867642.82842: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.82891: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825535070> <<< 34139 1726867642.83005: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825535a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825534920> <<< 34139 1726867642.83025: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825509df0> <<< 34139 1726867642.83064: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34139 1726867642.83082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34139 1726867642.83176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825536e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825535b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257eac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34139 1726867642.83237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.83257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34139 1726867642.83284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34139 1726867642.83395: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382555f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.83505: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3825583560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34139 1726867642.83519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34139 1726867642.83578: stdout chunk (state=3): >>>import 'ntpath' # <<< 34139 1726867642.83599: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 34139 1726867642.83617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e42c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34139 1726867642.83726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34139 1726867642.83796: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e6a20> <<< 34139 1726867642.83870: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e43e0> <<< 34139 1726867642.83898: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255a52b0> <<< 34139 1726867642.83944: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f253d0> <<< 34139 
1726867642.83958: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825582360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825537d70> <<< 34139 1726867642.84134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34139 1726867642.84155: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3824f25670> <<< 34139 1726867642.84497: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_nt8ujkb0/ansible_setup_payload.zip' # zipimport: zlib available <<< 34139 1726867642.84604: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34139 1726867642.84610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34139 1726867642.84680: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34139 1726867642.84714: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f8f170> <<< 34139 1726867642.84717: stdout chunk (state=3): >>>import '_typing' # <<< 34139 1726867642.84890: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f6e060> <<< 34139 1726867642.84926: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f6d1f0> # zipimport: zlib available import 'ansible' 
# <<< 34139 1726867642.84993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 34139 1726867642.86391: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.87560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34139 1726867642.87565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f8d040> <<< 34139 1726867642.87612: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.87889: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbeb10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # 
code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f8fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34139 1726867642.88020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34139 1726867642.88054: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e29e20> <<< 34139 1726867642.88084: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.88087: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e2ba40> <<< 34139 1726867642.88107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34139 1726867642.88123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34139 1726867642.88204: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34139 1726867642.88234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34139 1726867642.88237: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2d5b0> <<< 34139 1726867642.88239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34139 1726867642.88324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34139 1726867642.88463: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e34680> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2e2d0> # 
/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34139 1726867642.88466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34139 1726867642.88482: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34139 1726867642.88497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34139 1726867642.88907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e37fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e36a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e367e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e36d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2e7e0> <<< 34139 1726867642.88912: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e7c1d0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7c380> <<< 34139 1726867642.89013: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34139 1726867642.89017: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7dbe0> <<< 34139 1726867642.89020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34139 1726867642.89054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34139 1726867642.89108: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e803b0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7e510> <<< 34139 1726867642.89128: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34139 1726867642.89402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e83b90> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e80560> <<< 34139 1726867642.89438: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.89464: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.89480: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84d40> <<< 34139 1726867642.89508: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' 
import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84e30> <<< 34139 1726867642.89527: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7c4d0> <<< 34139 1726867642.89551: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34139 1726867642.89564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34139 1726867642.89588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34139 1726867642.89609: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.89641: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867642.89657: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d0c530> <<< 34139 1726867642.89873: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d0d490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e86cc0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e87890> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e868d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34139 1726867642.89895: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.89973: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.90061: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.90102: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 34139 1726867642.90124: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34139 1726867642.90135: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.90238: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.90359: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.90897: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.91434: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34139 1726867642.91461: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34139 1726867642.91584: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34139 1726867642.91611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' 
executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d15790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34139 1726867642.91636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d164e0> <<< 34139 1726867642.91655: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d0d7f0> <<< 34139 1726867642.91716: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34139 1726867642.91720: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.91746: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 34139 1726867642.91756: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.92003: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.92047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34139 1726867642.92071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d162a0> # zipimport: zlib available <<< 34139 1726867642.92526: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.92959: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.93028: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.93096: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34139 1726867642.93116: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 
1726867642.93207: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34139 1726867642.93256: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.93339: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34139 1726867642.93381: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34139 1726867642.93530: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.93544: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34139 1726867642.93691: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.93917: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34139 1726867642.93972: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34139 1726867642.94034: stdout chunk (state=3): >>>import '_ast' # <<< 34139 1726867642.94083: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d176b0> <<< 34139 1726867642.94111: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94145: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94283: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 34139 1726867642.94298: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94341: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34139 1726867642.94345: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34139 1726867642.94387: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94431: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94480: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94552: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34139 1726867642.94619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.94671: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d22150> <<< 34139 1726867642.94725: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d1d940> <<< 34139 1726867642.94746: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34139 1726867642.94883: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.94906: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.94954: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.94994: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34139 1726867642.94998: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34139 1726867642.95015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34139 1726867642.95090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34139 1726867642.95094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34139 1726867642.95117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34139 1726867642.95189: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e0aa80> <<< 34139 1726867642.95200: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fea780> <<< 34139 1726867642.95285: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d222a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825583b90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34139 1726867642.95320: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.95354: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34139 1726867642.95400: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34139 1726867642.95427: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 34139 1726867642.95528: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.95601: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.95656: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.95659: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.95823: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34139 1726867642.95842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.95922: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.95937: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.96022: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34139 1726867642.96206: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.96318: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.96494: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.96498: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867642.96503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34139 1726867642.96531: 
stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db2390> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34139 1726867642.96573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34139 1726867642.96651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34139 1726867642.96674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824948260> <<< 34139 1726867642.96717: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824948830> <<< 34139 1726867642.96759: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d981d0> <<< 34139 1726867642.96784: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db2ed0> <<< 34139 1726867642.96808: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db0a70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db06b0> <<< 34139 1726867642.96833: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34139 1726867642.96935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34139 1726867642.96938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34139 1726867642.96940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34139 1726867642.96981: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382494b530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494ade0> <<< 34139 1726867642.97038: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382494afc0> <<< 34139 1726867642.97049: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34139 1726867642.97167: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494b6e0> <<< 34139 1726867642.97192: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34139 1726867642.97214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34139 1726867642.97260: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824996210> <<< 34139 1726867642.97301: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824994230> <<< 34139 1726867642.97341: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db0770> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 34139 1726867642.97346: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97384: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 34139 1726867642.97432: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34139 1726867642.97507: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97561: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97623: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34139 1726867642.97627: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.97651: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 34139 1726867642.97682: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 34139 1726867642.97772: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34139 1726867642.97837: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97881: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.97931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 34139 1726867642.97979: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.98041: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.98097: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.98175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34139 1726867642.98181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 34139 1726867642.98192: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.98694: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99145: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 34139 1726867642.99185: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99216: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 
1726867642.99269: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 34139 1726867642.99294: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99321: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 34139 1726867642.99346: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99397: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 34139 1726867642.99498: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34139 1726867642.99590: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 34139 1726867642.99612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99683: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867642.99771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34139 1726867642.99819: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824996390> <<< 34139 1726867642.99822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34139 1726867642.99844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34139 1726867642.99973: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3824996e10> <<< 34139 1726867642.99976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 34139 1726867642.99993: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00040: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34139 1726867643.00131: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00205: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 34139 1726867643.00368: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 34139 1726867643.00464: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00488: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.00541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34139 1726867643.00585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34139 1726867643.00659: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.00707: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38249d6420> <<< 34139 1726867643.00907: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38249c6180> import 'ansible.module_utils.facts.system.python' # <<< 34139 1726867643.00918: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 34139 1726867643.00962: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 34139 1726867643.01120: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01199: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01307: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01456: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 34139 1726867643.01470: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34139 1726867643.01500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34139 1726867643.01603: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01614: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01730: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.01898: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38249ea0c0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38249ebaa0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 
34139 1726867643.01901: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.01937: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34139 1726867643.01990: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.02137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34139 1726867643.02257: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867643.02348: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.02524: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.02527: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34139 1726867643.02627: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.02767: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 34139 1726867643.02803: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 34139 1726867643.03025: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.03029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 34139 1726867643.03070: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.03106: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.03639: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04139: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 34139 1726867643.04155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34139 1726867643.04294: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34139 1726867643.04389: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04475: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34139 1726867643.04585: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04736: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04900: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 34139 1726867643.04934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34139 1726867643.04937: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.04979: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 34139 1726867643.05123: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05216: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05413: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 34139 1726867643.05638: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05684: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 34139 1726867643.05747: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05751: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34139 1726867643.05768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 34139 1726867643.05802: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05893: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 34139 1726867643.05940: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.05970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 34139 1726867643.05983: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06034: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 34139 1726867643.06110: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06161: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 34139 1726867643.06489: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34139 1726867643.06761: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06816: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06878: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 34139 1726867643.06920: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34139 1726867643.06969: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.06996: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34139 1726867643.07069: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34139 1726867643.07118: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07188: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34139 1726867643.07303: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 34139 1726867643.07326: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07369: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34139 1726867643.07449: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34139 1726867643.07459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07495: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07539: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 34139 1726867643.07688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34139 1726867643.07704: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07752: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.07810: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34139 1726867643.07812: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08003: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34139 1726867643.08214: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08260: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34139 1726867643.08355: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34139 1726867643.08418: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08493: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34139 1726867643.08590: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08673: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.08763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34139 1726867643.08850: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.09496: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 34139 1726867643.09500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34139 1726867643.09528: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34139 1726867643.09572: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38247e79e0> <<< 34139 1726867643.09575: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38247e5040> <<< 34139 1726867643.09669: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38247e7500> <<< 34139 1726867643.10340: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "27", "second": "23", "epoch": "1726867643", "epoch_int": "1726867643", "date": "2024-09-20", "time": "17:27:23", "iso8601_micro": "2024-09-20T21:27:23.092871Z", "iso8601": "2024-09-20T21:27:23Z", "iso8601_basic": "20240920T172723092871", "iso8601_basic_short": "20240920T172723", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": 
["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34139 1726867643.10890: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34139 1726867643.10894: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 34139 1726867643.10925: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types <<< 34139 1726867643.11010: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # 
cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 34139 1726867643.11014: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset <<< 34139 1726867643.11017: stdout chunk (state=3): >>># destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
collections.abc <<< 34139 1726867643.11019: stdout chunk (state=3): >>># cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale <<< 34139 1726867643.11053: stdout chunk (state=3): >>># cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 34139 1726867643.11075: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 34139 1726867643.11081: 
stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing <<< 34139 1726867643.11125: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi <<< 34139 1726867643.11174: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace <<< 34139 1726867643.11181: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing 
stringprep # cleanup[2] removing encodings.idna <<< 34139 1726867643.11501: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34139 1726867643.11562: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 34139 1726867643.11576: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34139 1726867643.11632: stdout chunk (state=3): >>># destroy ntpath <<< 34139 1726867643.11638: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 34139 1726867643.11690: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 34139 1726867643.11703: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34139 1726867643.11746: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 34139 1726867643.11757: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34139 1726867643.11805: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 34139 1726867643.11847: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 34139 1726867643.11887: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata 
# destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 34139 1726867643.11934: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl <<< 34139 1726867643.11957: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 34139 1726867643.11996: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob <<< 34139 1726867643.12011: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 34139 1726867643.12065: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 34139 1726867643.12115: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 34139 1726867643.12125: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # 
cleanup[3] wiping math # cleanup[3] wiping warnings <<< 34139 1726867643.12212: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 34139 1726867643.12216: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 34139 1726867643.12229: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34139 1726867643.12372: stdout chunk (state=3): >>># destroy sys.monitoring <<< 34139 1726867643.12398: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 34139 1726867643.12436: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath 
# destroy re._parser # destroy tokenize <<< 34139 1726867643.12466: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34139 1726867643.12488: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34139 1726867643.12521: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34139 1726867643.12626: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 34139 1726867643.12650: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 34139 1726867643.12681: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 34139 1726867643.12722: stdout chunk (state=3): >>># destroy itertools <<< 34139 1726867643.12735: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34139 1726867643.13140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 34139 1726867643.13144: stdout chunk (state=3): >>><<< 34139 1726867643.13146: stderr chunk (state=3): >>><<< 34139 1726867643.13360: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38259684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825937b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382596aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382571d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382571dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382575bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382575bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825793830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825793ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825773b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257712b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825759070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825772150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257b0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38257e8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38257e8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825756e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257e9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257ea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825800740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825801e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825802cc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38258032f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825802210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825803d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38258034a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257ea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382550bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825534710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825534470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825534740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825535070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3825535a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825534920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825509df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825536e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825535b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38257eac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382555f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825583560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e42c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e6a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255e43e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38255a52b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825582360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825537d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3824f25670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_nt8ujkb0/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3824f8f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f6e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f6d1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f8d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbeb10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbe600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824f8fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824fbfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fbff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e29e20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e2ba40> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e34680> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2e2d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e37fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e36a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e367e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e36d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e2e7e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e7c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7dbe0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e803b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7e510> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e83b90> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e80560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84d40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e84e30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e7c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d0c530> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d0d490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e86cc0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824e87890> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e868d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d15790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d164e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d0d7f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d162a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d176b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824d22150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d1d940> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824e0aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824fea780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d222a0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3825583b90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db2390> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824948260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824948830> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824d981d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db2ed0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db0a70> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db06b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382494b530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494ade0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f382494afc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494a240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f382494b6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3824996210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824994230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824db0770> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824996390> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3824996e10> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38249d6420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38249c6180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38249ea0c0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38249ebaa0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38247e79e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38247e5040> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38247e7500> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "27", "second": "23", "epoch": "1726867643", "epoch_int": "1726867643", "date": "2024-09-20", "time": "17:27:23", "iso8601_micro": "2024-09-20T21:27:23.092871Z", "iso8601": "2024-09-20T21:27:23Z", "iso8601_basic": "20240920T172723092871", "iso8601_basic_short": 
"20240920T172723", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] 
removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # 
cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
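The `auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'` lines above show OpenSSH connection multiplexing: Ansible's `ssh` connection plugin reuses a persistent ControlMaster socket so repeated tasks skip the SSH handshake. A minimal `ansible.cfg` sketch of the relevant settings (the `control_path_dir` value and the 60s persist window are illustrative assumptions, not taken from this run):

```ini
[ssh_connection]
; Reuse one SSH connection per host via a ControlMaster socket
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
; Directory for the multiplexing sockets (Ansible hashes host/port/user into the name)
control_path_dir = ~/.ansible/cp
; Send the module over the open connection instead of copying a temp file first
pipelining = True
```

With `ControlPersist` set, the master connection stays open between tasks, which is why the log shows `mux_client_request_session` succeeding against an existing master rather than a fresh key exchange.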
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # destroy
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34139 1726867643.15132: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34139 1726867643.15136: _low_level_execute_command(): starting 34139 1726867643.15138: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867642.635362-34224-172474604501522/ > /dev/null 2>&1 && sleep 0' 34139 1726867643.15141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.15185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867643.15204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867643.15244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.15316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.17173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.17179: stdout chunk (state=3): >>><<< 34139 1726867643.17182: stderr chunk (state=3): >>><<< 34139 1726867643.17389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867643.17392: handler run complete 34139 1726867643.17394: variable 'ansible_facts' from source: unknown 34139 1726867643.17396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.17443: variable 'ansible_facts' from source: unknown 34139 1726867643.17504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.17565: attempt loop complete, returning result 34139 1726867643.17573: _execute() done 34139 1726867643.17583: dumping result to json 34139 1726867643.17610: done dumping result, returning 34139 1726867643.17623: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-c103-b8fd-000000000158] 34139 1726867643.17630: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000158 ok: [managed_node1] 34139 1726867643.17906: no more pending results, 
returning what we have 34139 1726867643.17912: results queue empty 34139 1726867643.17913: checking for any_errors_fatal 34139 1726867643.17915: done checking for any_errors_fatal 34139 1726867643.17916: checking for max_fail_percentage 34139 1726867643.17917: done checking for max_fail_percentage 34139 1726867643.17918: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.17919: done checking to see if all hosts have failed 34139 1726867643.17920: getting the remaining hosts for this loop 34139 1726867643.17921: done getting the remaining hosts for this loop 34139 1726867643.17925: getting the next task for host managed_node1 34139 1726867643.17934: done getting next task for host managed_node1 34139 1726867643.17937: ^ task is: TASK: Check if system is ostree 34139 1726867643.17940: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.17943: getting variables 34139 1726867643.17945: in VariableManager get_vars() 34139 1726867643.17975: Calling all_inventory to load vars for managed_node1 34139 1726867643.18084: Calling groups_inventory to load vars for managed_node1 34139 1726867643.18089: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.18100: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.18102: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.18106: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.18546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.18863: done with get_vars() 34139 1726867643.18875: done getting variables 34139 1726867643.18915: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000158 34139 1726867643.18918: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:27:23 -0400 (0:00:00.632) 0:00:01.936 ****** 34139 1726867643.18999: entering _queue_task() for managed_node1/stat 34139 1726867643.19353: worker is 1 (out of 1 available) 34139 1726867643.19366: exiting _queue_task() for managed_node1/stat 34139 1726867643.19380: done queuing things up, now waiting for results queue to drain 34139 1726867643.19382: waiting for pending results... 
34139 1726867643.19624: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 34139 1726867643.19743: in run() - task 0affcac9-a3a5-c103-b8fd-00000000015a 34139 1726867643.19770: variable 'ansible_search_path' from source: unknown 34139 1726867643.19851: variable 'ansible_search_path' from source: unknown 34139 1726867643.19854: calling self._execute() 34139 1726867643.19959: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.19962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.19972: variable 'omit' from source: magic vars 34139 1726867643.20449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34139 1726867643.20741: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34139 1726867643.20791: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34139 1726867643.20844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34139 1726867643.20892: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34139 1726867643.20998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34139 1726867643.21047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34139 1726867643.21075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34139 1726867643.21156: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34139 1726867643.21257: Evaluated conditional (not __network_is_ostree is defined): True 34139 1726867643.21279: variable 'omit' from source: magic vars 34139 1726867643.21374: variable 'omit' from source: magic vars 34139 1726867643.21379: variable 'omit' from source: magic vars 34139 1726867643.21397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34139 1726867643.21484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34139 1726867643.21487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34139 1726867643.21583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.21587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.21589: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34139 1726867643.21592: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.21593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.21679: Set connection var ansible_timeout to 10 34139 1726867643.21692: Set connection var ansible_shell_type to sh 34139 1726867643.21712: Set connection var ansible_shell_executable to /bin/sh 34139 1726867643.21728: Set connection var ansible_pipelining to False 34139 1726867643.21737: Set connection var ansible_connection to ssh 34139 1726867643.21746: Set connection var ansible_module_compression to ZIP_DEFLATED 34139 1726867643.21774: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.21784: variable 'ansible_connection' from 
source: unknown 34139 1726867643.21818: variable 'ansible_module_compression' from source: unknown 34139 1726867643.21823: variable 'ansible_shell_type' from source: unknown 34139 1726867643.21827: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.21829: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.21831: variable 'ansible_pipelining' from source: unknown 34139 1726867643.21833: variable 'ansible_timeout' from source: unknown 34139 1726867643.21835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.21979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34139 1726867643.22037: variable 'omit' from source: magic vars 34139 1726867643.22041: starting attempt loop 34139 1726867643.22044: running the handler 34139 1726867643.22046: _low_level_execute_command(): starting 34139 1726867643.22048: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34139 1726867643.22752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34139 1726867643.22768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867643.22788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867643.22807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34139 1726867643.22913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867643.22938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.23031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.24647: stdout chunk (state=3): >>>/root <<< 34139 1726867643.24802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.24805: stdout chunk (state=3): >>><<< 34139 1726867643.24807: stderr chunk (state=3): >>><<< 34139 1726867643.24915: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867643.24928: _low_level_execute_command(): starting 34139 1726867643.24932: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252 `" && echo ansible-tmp-1726867643.2483273-34248-213245628190252="` echo /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252 `" ) && sleep 0' 34139 1726867643.25548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867643.25568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.25706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.27643: stdout 
chunk (state=3): >>>ansible-tmp-1726867643.2483273-34248-213245628190252=/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252 <<< 34139 1726867643.27823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.27827: stdout chunk (state=3): >>><<< 34139 1726867643.27829: stderr chunk (state=3): >>><<< 34139 1726867643.27990: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867643.2483273-34248-213245628190252=/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867643.27993: variable 'ansible_module_compression' from source: unknown 34139 1726867643.27995: ANSIBALLZ: Using lock for stat 34139 1726867643.27997: ANSIBALLZ: Acquiring lock 34139 1726867643.27999: ANSIBALLZ: Lock acquired: 140192904008464 34139 
1726867643.28001: ANSIBALLZ: Creating module 34139 1726867643.37658: ANSIBALLZ: Writing module into payload 34139 1726867643.37724: ANSIBALLZ: Writing module 34139 1726867643.37740: ANSIBALLZ: Renaming module 34139 1726867643.37744: ANSIBALLZ: Done creating module 34139 1726867643.37760: variable 'ansible_facts' from source: unknown 34139 1726867643.37807: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py 34139 1726867643.37906: Sending initial data 34139 1726867643.37912: Sent initial data (153 bytes) 34139 1726867643.38373: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867643.38376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.38381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867643.38383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 34139 1726867643.38385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.38439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867643.38442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 
1726867643.38444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.38502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.40135: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34139 1726867643.40139: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34139 1726867643.40180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34139 1726867643.40231: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34139vobchn_u/tmpcxm5ezde /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py <<< 34139 1726867643.40234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py" <<< 34139 1726867643.40269: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34139vobchn_u/tmpcxm5ezde" to remote "/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py" <<< 34139 1726867643.40810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.40850: stderr chunk (state=3): >>><<< 34139 1726867643.40854: stdout chunk (state=3): >>><<< 34139 1726867643.40884: done transferring module to remote 34139 1726867643.40893: _low_level_execute_command(): starting 34139 1726867643.40897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/ /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py && sleep 0' 34139 1726867643.41321: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867643.41325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 34139 1726867643.41327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 
1726867643.41329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.41379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 1726867643.41383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.41433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.43184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.43205: stderr chunk (state=3): >>><<< 34139 1726867643.43210: stdout chunk (state=3): >>><<< 34139 1726867643.43223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867643.43226: _low_level_execute_command(): starting 34139 1726867643.43231: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/AnsiballZ_stat.py && sleep 0' 34139 1726867643.43676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867643.43681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.43683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 34139 1726867643.43685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34139 1726867643.43687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.43742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 34139 
1726867643.43746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34139 1726867643.43750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.43814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.45934: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34139 1726867643.45983: stdout chunk (state=3): >>>import _imp # builtin <<< 34139 1726867643.45999: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34139 1726867643.46091: stdout chunk (state=3): >>>import '_io' # <<< 34139 1726867643.46094: stdout chunk (state=3): >>>import 'marshal' # <<< 34139 1726867643.46137: stdout chunk (state=3): >>>import 'posix' # <<< 34139 1726867643.46142: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34139 1726867643.46181: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34139 1726867643.46222: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.46238: stdout chunk (state=3): >>>import '_codecs' # <<< 34139 1726867643.46263: stdout chunk (state=3): >>>import 'codecs' # <<< 34139 1726867643.46313: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34139 1726867643.46332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d86184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d85e7b30> <<< 34139 1726867643.46486: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d861aa50> <<< 34139 1726867643.46489: stdout chunk (state=3): >>>import '_signal' # <<< 34139 1726867643.46491: stdout chunk (state=3): >>>import '_abc' # <<< 34139 1726867643.46494: stdout chunk (state=3): >>>import 'abc' # <<< 34139 1726867643.46495: stdout chunk (state=3): >>>import 'io' # <<< 34139 1726867643.46611: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 34139 1726867643.46664: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 34139 1726867643.46683: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34139 1726867643.46702: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34139 1726867643.46736: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8409130> <<< 34139 1726867643.46787: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34139 1726867643.46812: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8409fa0> <<< 34139 1726867643.46842: stdout chunk (state=3): >>>import 'site' # <<< 34139 1726867643.46869: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34139 1726867643.47095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34139 1726867643.47114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34139 1726867643.47156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34139 1726867643.47200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34139 1726867643.47235: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34139 1726867643.47239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34139 1726867643.47296: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8447e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34139 1726867643.47312: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8447f50> <<< 34139 1726867643.47335: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34139 1726867643.47356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34139 1726867643.47391: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34139 1726867643.47431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.47481: stdout chunk (state=3): >>>import 'itertools' # <<< 34139 1726867643.47529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d847f890> <<< 34139 1726867643.47533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d847ff20> <<< 34139 1726867643.47546: stdout chunk (state=3): >>>import '_collections' # <<< 34139 1726867643.47588: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845fb60> import '_functools' # <<< 34139 1726867643.47621: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845d280> <<< 34139 1726867643.47724: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8445040> <<< 34139 1726867643.47787: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py 
# code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34139 1726867643.47926: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849cc80> <<< 34139 1726867643.47985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d4890> <<< 34139 1726867643.48013: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84442c0> <<< 34139 1726867643.48055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34139 1726867643.48100: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84d4d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d4bf0> <<< 34139 1726867643.48105: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.48136: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84d4fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8442de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.48191: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34139 1726867643.48219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34139 1726867643.48271: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d53a0> import 'importlib.machinery' # <<< 34139 1726867643.48276: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 34139 1726867643.48315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d65d0> 
import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34139 1726867643.48363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34139 1726867643.48406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ec7a0> import 'errno' # <<< 34139 1726867643.48450: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84edeb0> <<< 34139 1726867643.48454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34139 1726867643.48520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34139 1726867643.48524: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84eed50> <<< 34139 1726867643.48571: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84ef380> import 'bz2' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ee2a0> <<< 34139 1726867643.48574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34139 1726867643.48635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34139 1726867643.48639: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84efe00> <<< 34139 1726867643.48658: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ef530> <<< 34139 1726867643.48693: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d6570> <<< 34139 1726867643.48705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34139 1726867643.48758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34139 1726867643.48761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34139 1726867643.48825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d8277ce0> <<< 34139 1726867643.48829: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34139 1726867643.48882: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a04a0> <<< 34139 1726867643.48913: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a0770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34139 1726867643.48929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34139 1726867643.48994: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.49132: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a10a0> <<< 34139 1726867643.49235: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a1a60> <<< 34139 1726867643.49266: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8275e80> <<< 34139 1726867643.49272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34139 1726867643.49285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34139 1726867643.49313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34139 1726867643.49319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34139 1726867643.49332: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a2e10> <<< 34139 1726867643.49354: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a18e0> <<< 34139 1726867643.49376: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d6cc0> <<< 34139 1726867643.49395: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34139 1726867643.49458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.49473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34139 1726867643.49507: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34139 1726867643.49532: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82cb170> <<< 34139 1726867643.49596: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34139 1726867643.49599: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.49628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34139 1726867643.49637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34139 1726867643.49685: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82ef4d0> <<< 34139 1726867643.49701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34139 1726867643.49746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34139 1726867643.49794: stdout chunk (state=3): >>>import 'ntpath' # <<< 34139 1726867643.49822: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83502f0> <<< 34139 1726867643.49839: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34139 1726867643.49867: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34139 1726867643.49895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34139 1726867643.49930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34139 1726867643.50016: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8352a20> <<< 34139 1726867643.50090: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83503e0> <<< 34139 1726867643.50127: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83152e0> <<< 34139 1726867643.50153: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 34139 1726867643.50161: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81553d0> <<< 34139 1726867643.50176: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82ee300> <<< 34139 1726867643.50182: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a3d40> <<< 34139 1726867643.50285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34139 1726867643.50305: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f33d82ee660> <<< 34139 1726867643.50446: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_dkd3y2p8/ansible_stat_payload.zip' <<< 34139 1726867643.50453: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 34139 1726867643.50579: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.50605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34139 1726867643.50612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34139 1726867643.50661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34139 1726867643.50727: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34139 1726867643.50765: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81ab0e0> <<< 34139 1726867643.50771: stdout chunk (state=3): >>>import '_typing' # <<< 34139 1726867643.50954: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8189fd0> <<< 34139 1726867643.50957: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8189160> # zipimport: zlib available <<< 34139 1726867643.50999: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34139 1726867643.51025: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.51031: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.51048: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34139 1726867643.51060: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.52450: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.53580: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34139 1726867643.53591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81a9760> <<< 34139 1726867643.53608: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.53638: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34139 1726867643.53642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34139 1726867643.53670: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34139 1726867643.53707: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.53714: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d2ab0> <<< 34139 1726867643.53747: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2870> <<< 34139 1726867643.53785: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2180> <<< 34139 1726867643.53802: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34139 1726867643.53810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34139 1726867643.53846: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2660> <<< 34139 1726867643.53854: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d861a9c0> <<< 34139 1726867643.53859: stdout chunk (state=3): >>>import 'atexit' # <<< 34139 1726867643.53888: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d3830> <<< 34139 1726867643.53922: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.53927: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d39e0> <<< 34139 1726867643.53942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34139 1726867643.53994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34139 1726867643.53998: stdout chunk (state=3): >>>import '_locale' # <<< 34139 1726867643.54050: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d3f20> <<< 34139 1726867643.54053: stdout chunk (state=3): >>>import 'pwd' # <<< 34139 
1726867643.54080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34139 1726867643.54099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34139 1726867643.54138: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b11cd0> <<< 34139 1726867643.54167: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.54175: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b138f0> <<< 34139 1726867643.54190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34139 1726867643.54207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34139 1726867643.54242: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b142f0> <<< 34139 1726867643.54261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34139 1726867643.54291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34139 1726867643.54312: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b15490> <<< 34139 1726867643.54329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34139 1726867643.54366: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34139 1726867643.54385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 34139 1726867643.54390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34139 1726867643.54441: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b17ef0> <<< 34139 1726867643.54479: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d8442ed0> <<< 34139 1726867643.54500: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b161e0> <<< 34139 1726867643.54520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34139 1726867643.54555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34139 1726867643.54571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 34139 1726867643.54584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34139 1726867643.54591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34139 1726867643.54624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34139 1726867643.54649: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34139 1726867643.54652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1fdd0> <<< 34139 1726867643.54673: stdout chunk (state=3): >>>import '_tokenize' # <<< 34139 1726867643.54735: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1e8d0> <<< 34139 1726867643.54746: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1e630> <<< 34139 1726867643.54761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34139 1726867643.54772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34139 1726867643.54838: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1eba0> <<< 34139 1726867643.54869: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b166c0> <<< 34139 1726867643.54896: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.54901: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b67fb0> <<< 34139 1726867643.54927: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 34139 1726867643.54933: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b68170> <<< 34139 1726867643.54945: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34139 1726867643.54968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34139 1726867643.54987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 34139 1726867643.54993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34139 1726867643.55026: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55032: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b69bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b69970> <<< 34139 1726867643.55047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34139 1726867643.55167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34139 1726867643.55220: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f33d7b6bfe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6a270> <<< 34139 1726867643.55246: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34139 1726867643.55282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.55305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34139 1726867643.55319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34139 1726867643.55325: stdout chunk (state=3): >>>import '_string' # <<< 34139 1726867643.55366: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6f7a0> <<< 34139 1726867643.55489: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6c170> <<< 34139 1726867643.55552: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55558: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70a70> <<< 34139 1726867643.55587: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 34139 1726867643.55592: stdout chunk (state=3): >>> import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70770> <<< 34139 1726867643.55630: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55637: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70a40> <<< 34139 1726867643.55648: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b68290> <<< 34139 1726867643.55669: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34139 1726867643.55686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34139 1726867643.55705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34139 1726867643.55735: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55761: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55768: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7bfc1a0> <<< 34139 1726867643.55912: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34139 1726867643.55915: stdout chunk (state=3): 
>>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7bfd610> <<< 34139 1726867643.55928: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b72930> <<< 34139 1726867643.55963: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b73ce0> <<< 34139 1726867643.55966: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b72570> <<< 34139 1726867643.55983: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56000: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34139 1726867643.56016: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56100: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56205: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56208: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56211: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 34139 1726867643.56230: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56234: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34139 1726867643.56254: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56370: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.56488: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34139 1726867643.57027: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.57565: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34139 1726867643.57571: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34139 1726867643.57600: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34139 1726867643.57612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.57665: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7a017c0> <<< 34139 1726867643.57741: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34139 1726867643.57751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34139 1726867643.57770: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a02540> <<< 34139 1726867643.57778: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7bfd730> <<< 34139 1726867643.57820: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34139 1726867643.57833: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.57850: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 
1726867643.57872: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34139 1726867643.57879: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.58024: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.58181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34139 1726867643.58190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a02240> <<< 34139 1726867643.58204: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.58658: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59102: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59171: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59248: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34139 1726867643.59254: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59295: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59326: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34139 1726867643.59341: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59403: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59489: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34139 1726867643.59494: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59519: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34139 1726867643.59533: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59573: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59612: stdout chunk 
(state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34139 1726867643.59619: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.59850: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34139 1726867643.60140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34139 1726867643.60150: stdout chunk (state=3): >>>import '_ast' # <<< 34139 1726867643.60223: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a037a0> <<< 34139 1726867643.60226: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60305: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60379: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34139 1726867643.60390: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34139 1726867643.60401: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34139 1726867643.60416: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60458: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60493: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34139 1726867643.60508: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60548: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60595: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60648: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60716: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34139 1726867643.60754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.60842: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7a0e060> <<< 34139 1726867643.60886: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a0b050> <<< 34139 1726867643.60914: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34139 1726867643.60917: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 34139 1726867643.60930: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.60994: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61053: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61086: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61122: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34139 1726867643.61127: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34139 1726867643.61144: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34139 1726867643.61176: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34139 1726867643.61186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34139 1726867643.61248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 34139 1726867643.61254: stdout chunk (state=3): >>> <<< 34139 1726867643.61286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34139 1726867643.61289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34139 1726867643.61352: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8226930> <<< 34139 1726867643.61394: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d820e600> <<< 34139 1726867643.61478: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a0e120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a03080> # destroy ansible.module_utils.distro <<< 34139 1726867643.61488: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 34139 1726867643.61491: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61526: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61548: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34139 1726867643.61553: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34139 1726867643.61611: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34139 1726867643.61623: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61634: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34139 1726867643.61645: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 34139 1726867643.61653: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61784: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.61974: stdout chunk (state=3): >>># zipimport: zlib available <<< 34139 1726867643.62096: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 34139 1726867643.62433: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2<<< 34139 1726867643.62437: stdout chunk (state=3): >>> # clear sys.last_exc # clear sys.last_type <<< 34139 1726867643.62455: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 34139 1726867643.62479: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ <<< 34139 1726867643.62496: stdout chunk (state=3): >>># cleanup[2] removing 
_stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser <<< 34139 1726867643.62500: stdout chunk (state=3): >>># cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch <<< 34139 1726867643.62524: stdout chunk (state=3): >>># cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # 
cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile <<< 34139 1726867643.62538: stdout chunk (state=3): >>># cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils <<< 34139 1726867643.62549: stdout chunk (state=3): >>># cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime <<< 34139 1726867643.62559: stdout chunk (state=3): >>># 
cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader<<< 34139 1726867643.62575: stdout chunk (state=3): >>> # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 34139 1726867643.62599: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34139 1726867643.62842: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34139 1726867643.62854: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34139 1726867643.62867: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 34139 1726867643.62889: stdout chunk (state=3): >>># destroy _blake2 <<< 34139 1726867643.62901: stdout 
chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34139 1726867643.62913: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 34139 1726867643.62933: stdout chunk (state=3): >>># destroy ntpath <<< 34139 1726867643.62962: stdout chunk (state=3): >>># destroy importlib <<< 34139 1726867643.62989: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 34139 1726867643.62994: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 34139 1726867643.63012: stdout chunk (state=3): >>># destroy pwd # destroy locale # destroy signal # destroy fcntl <<< 34139 1726867643.63015: stdout chunk (state=3): >>># destroy select # destroy _signal # destroy _posixsubprocess <<< 34139 1726867643.63031: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 34139 1726867643.63063: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux <<< 34139 1726867643.63068: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 34139 1726867643.63120: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket<<< 34139 1726867643.63140: stdout chunk (state=3): >>> # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping 
systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34139 1726867643.63193: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 34139 1726867643.63199: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34139 1726867643.63202: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 34139 1726867643.63206: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 34139 1726867643.63227: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 34139 1726867643.63245: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 34139 1726867643.63266: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time 
# cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 34139 1726867643.63278: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 34139 1726867643.63290: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128<<< 34139 1726867643.63297: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34139 1726867643.63413: stdout chunk (state=3): >>># destroy sys.monitoring <<< 34139 1726867643.63421: stdout chunk (state=3): >>># destroy _socket <<< 34139 1726867643.63425: stdout chunk (state=3): >>># destroy _collections <<< 34139 1726867643.63455: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 34139 1726867643.63460: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34139 1726867643.63487: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34139 1726867643.63518: stdout chunk (state=3): >>># destroy _typing <<< 34139 1726867643.63523: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response <<< 34139 1726867643.63542: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34139 1726867643.63558: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy 
_frozen_importlib<<< 34139 1726867643.63563: stdout chunk (state=3): >>> <<< 34139 1726867643.63648: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 34139 1726867643.63652: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 34139 1726867643.63662: stdout chunk (state=3): >>># destroy time <<< 34139 1726867643.63694: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 34139 1726867643.63709: stdout chunk (state=3): >>># destroy _hashlib <<< 34139 1726867643.63719: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re <<< 34139 1726867643.63727: stdout chunk (state=3): >>># destroy itertools <<< 34139 1726867643.63753: stdout chunk (state=3): >>># destroy _abc <<< 34139 1726867643.63774: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34139 1726867643.64284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 34139 1726867643.64287: stdout chunk (state=3): >>><<< 34139 1726867643.64290: stderr chunk (state=3): >>><<< 34139 1726867643.64307: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d86184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d85e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d861aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8409130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8409fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8447e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8447f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d847f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d847ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8445040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849e420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d845e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d849cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d4890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84442c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84d4d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d4bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84d4fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8442de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d56d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d53a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d65d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ec7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84edeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84eed50> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84ef380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ee2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d84efe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84ef530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d6570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d8277ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a0770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a10a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d82a1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8275e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a2e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a18e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d84d6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82cb170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82ef4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83502f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8352a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83503e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d83152e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81553d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82ee300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d82a3d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f33d82ee660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_dkd3y2p8/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f33d81ab0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8189fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8189160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81a9760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d2ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d861a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d3830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d81d39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d81d3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b11cd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b138f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b142f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b15490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b17ef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d8442ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b161e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1fdd0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1e8d0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1e630> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b1eba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b166c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b67fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b68170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b69bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b69970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b6bfe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6a270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6f7a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b6c170> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70a70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b70a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b68290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7bfc1a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7bfd610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b72930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7b73ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7b72570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7a017c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a02540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7bfd730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a02240> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a037a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f33d7a0e060> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a0b050> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d8226930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d820e600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a0e120> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f33d7a03080> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34139 1726867643.65139: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34139 1726867643.65143: _low_level_execute_command(): starting 34139 1726867643.65146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867643.2483273-34248-213245628190252/ > /dev/null 2>&1 && sleep 0' 34139 1726867643.65151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34139 1726867643.65153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867643.65155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34139 1726867643.65158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 34139 1726867643.65163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34139 1726867643.65168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 34139 1726867643.65171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34139 1726867643.65174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34139 1726867643.65185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34139 1726867643.67002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34139 1726867643.67024: stderr chunk (state=3): >>><<< 34139 1726867643.67028: stdout chunk (state=3): >>><<< 34139 1726867643.67039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34139 1726867643.67075: handler run complete 34139 1726867643.67080: attempt loop complete, returning result 34139 1726867643.67282: _execute() done 34139 1726867643.67285: dumping result to json 34139 1726867643.67286: done dumping result, returning 34139 1726867643.67288: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affcac9-a3a5-c103-b8fd-00000000015a] 34139 1726867643.67290: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015a 34139 1726867643.67345: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015a 34139 1726867643.67348: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 34139 1726867643.67407: no more pending results, returning what we have 34139 1726867643.67410: results queue empty 34139 1726867643.67411: checking for any_errors_fatal 34139 1726867643.67417: done checking for any_errors_fatal 34139 1726867643.67418: checking for max_fail_percentage 34139 1726867643.67419: done checking for max_fail_percentage 34139 1726867643.67420: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.67421: done checking to see if all hosts have failed 34139 1726867643.67421: getting the remaining hosts for this loop 34139 1726867643.67423: done getting the remaining hosts for this loop 34139 1726867643.67427: getting the next task for host managed_node1 34139 1726867643.67433: done getting next task for host managed_node1 34139 1726867643.67436: ^ task is: TASK: Set flag to indicate system is ostree 34139 1726867643.67438: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34139 1726867643.67442: getting variables 34139 1726867643.67444: in VariableManager get_vars() 34139 1726867643.67473: Calling all_inventory to load vars for managed_node1 34139 1726867643.67476: Calling groups_inventory to load vars for managed_node1 34139 1726867643.67482: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.67492: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.67494: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.67497: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.67879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.68088: done with get_vars() 34139 1726867643.68099: done getting variables 34139 1726867643.68191: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:27:23 -0400 (0:00:00.492) 0:00:02.428 ****** 34139 1726867643.68222: entering _queue_task() for managed_node1/set_fact 34139 1726867643.68223: 
Creating lock for set_fact 34139 1726867643.68682: worker is 1 (out of 1 available) 34139 1726867643.68691: exiting _queue_task() for managed_node1/set_fact 34139 1726867643.68700: done queuing things up, now waiting for results queue to drain 34139 1726867643.68701: waiting for pending results... 34139 1726867643.68760: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 34139 1726867643.68881: in run() - task 0affcac9-a3a5-c103-b8fd-00000000015b 34139 1726867643.68885: variable 'ansible_search_path' from source: unknown 34139 1726867643.68888: variable 'ansible_search_path' from source: unknown 34139 1726867643.68916: calling self._execute() 34139 1726867643.68984: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.68988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.68997: variable 'omit' from source: magic vars 34139 1726867643.69532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34139 1726867643.69773: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34139 1726867643.69818: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34139 1726867643.69852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34139 1726867643.69884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34139 1726867643.69966: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34139 1726867643.69994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 
34139 1726867643.70023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34139 1726867643.70047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34139 1726867643.70182: Evaluated conditional (not __network_is_ostree is defined): True 34139 1726867643.70186: variable 'omit' from source: magic vars 34139 1726867643.70205: variable 'omit' from source: magic vars 34139 1726867643.70383: variable '__ostree_booted_stat' from source: set_fact 34139 1726867643.70386: variable 'omit' from source: magic vars 34139 1726867643.70389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34139 1726867643.70421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34139 1726867643.70439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34139 1726867643.70456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.70466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.70495: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34139 1726867643.70498: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.70505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.70597: Set connection var ansible_timeout to 10 34139 1726867643.70602: Set connection var ansible_shell_type to sh 34139 1726867643.70615: Set connection var ansible_shell_executable to /bin/sh 34139 
1726867643.70617: Set connection var ansible_pipelining to False 34139 1726867643.70620: Set connection var ansible_connection to ssh 34139 1726867643.70725: Set connection var ansible_module_compression to ZIP_DEFLATED 34139 1726867643.70728: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.70731: variable 'ansible_connection' from source: unknown 34139 1726867643.70733: variable 'ansible_module_compression' from source: unknown 34139 1726867643.70735: variable 'ansible_shell_type' from source: unknown 34139 1726867643.70738: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.70741: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.70744: variable 'ansible_pipelining' from source: unknown 34139 1726867643.70746: variable 'ansible_timeout' from source: unknown 34139 1726867643.70749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.70776: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34139 1726867643.70787: variable 'omit' from source: magic vars 34139 1726867643.70793: starting attempt loop 34139 1726867643.70796: running the handler 34139 1726867643.70806: handler run complete 34139 1726867643.70818: attempt loop complete, returning result 34139 1726867643.70821: _execute() done 34139 1726867643.70824: dumping result to json 34139 1726867643.70830: done dumping result, returning 34139 1726867643.70833: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-c103-b8fd-00000000015b] 34139 1726867643.70836: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015b 34139 1726867643.70916: done sending task result for task 
0affcac9-a3a5-c103-b8fd-00000000015b 34139 1726867643.70919: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34139 1726867643.70968: no more pending results, returning what we have 34139 1726867643.70971: results queue empty 34139 1726867643.70972: checking for any_errors_fatal 34139 1726867643.70980: done checking for any_errors_fatal 34139 1726867643.70981: checking for max_fail_percentage 34139 1726867643.70982: done checking for max_fail_percentage 34139 1726867643.70983: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.70984: done checking to see if all hosts have failed 34139 1726867643.70985: getting the remaining hosts for this loop 34139 1726867643.70986: done getting the remaining hosts for this loop 34139 1726867643.70990: getting the next task for host managed_node1 34139 1726867643.70998: done getting next task for host managed_node1 34139 1726867643.71000: ^ task is: TASK: Fix CentOS6 Base repo 34139 1726867643.71003: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.71006: getting variables 34139 1726867643.71010: in VariableManager get_vars() 34139 1726867643.71043: Calling all_inventory to load vars for managed_node1 34139 1726867643.71046: Calling groups_inventory to load vars for managed_node1 34139 1726867643.71050: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.71059: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.71062: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.71070: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.71468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.71671: done with get_vars() 34139 1726867643.71683: done getting variables 34139 1726867643.71783: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:27:23 -0400 (0:00:00.035) 0:00:02.464 ****** 34139 1726867643.71810: entering _queue_task() for managed_node1/copy 34139 1726867643.72036: worker is 1 (out of 1 available) 34139 1726867643.72048: exiting _queue_task() for managed_node1/copy 34139 1726867643.72060: done queuing things up, now waiting for results queue to drain 34139 1726867643.72061: waiting for pending results... 
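The two task results traced above (the `stat` of `/run/ostree-booted` returning `exists: false`, then the `set_fact` that records `__network_is_ostree: false`) correspond to a task pair along these lines. This is a sketch reconstructed from the variable names and the logged conditional `not __network_is_ostree is defined`, not the verbatim contents of `el_repo_setup.yml`; details such as the exact `when` on the stat task are not visible in this log:

```yaml
# Reconstruction from the trace above; the real tasks live in
# tests/network/tasks/el_repo_setup.yml and may differ in detail.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    # Mapping inferred from the logged result: stat.exists was false
    # and __network_is_ostree came out false.
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

Because `set_fact` is an action plugin that runs on the controller, the trace shows no `_low_level_execute_command()` calls for the second task, unlike the `stat` task before it.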
34139 1726867643.72380: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 34139 1726867643.72420: in run() - task 0affcac9-a3a5-c103-b8fd-00000000015d 34139 1726867643.72433: variable 'ansible_search_path' from source: unknown 34139 1726867643.72437: variable 'ansible_search_path' from source: unknown 34139 1726867643.72545: calling self._execute() 34139 1726867643.72548: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.72551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.72554: variable 'omit' from source: magic vars 34139 1726867643.73043: variable 'ansible_distribution' from source: facts 34139 1726867643.73069: Evaluated conditional (ansible_distribution == 'CentOS'): True 34139 1726867643.73185: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.73191: Evaluated conditional (ansible_distribution_major_version == '6'): False 34139 1726867643.73194: when evaluation is False, skipping this task 34139 1726867643.73197: _execute() done 34139 1726867643.73199: dumping result to json 34139 1726867643.73203: done dumping result, returning 34139 1726867643.73214: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-c103-b8fd-00000000015d] 34139 1726867643.73219: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015d 34139 1726867643.73324: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015d 34139 1726867643.73328: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34139 1726867643.73395: no more pending results, returning what we have 34139 1726867643.73399: results queue empty 34139 1726867643.73400: checking for any_errors_fatal 34139 1726867643.73406: done checking for any_errors_fatal 34139 1726867643.73406: checking for 
max_fail_percentage 34139 1726867643.73411: done checking for max_fail_percentage 34139 1726867643.73412: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.73413: done checking to see if all hosts have failed 34139 1726867643.73414: getting the remaining hosts for this loop 34139 1726867643.73415: done getting the remaining hosts for this loop 34139 1726867643.73418: getting the next task for host managed_node1 34139 1726867643.73425: done getting next task for host managed_node1 34139 1726867643.73427: ^ task is: TASK: Include the task 'enable_epel.yml' 34139 1726867643.73431: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.73434: getting variables 34139 1726867643.73436: in VariableManager get_vars() 34139 1726867643.73464: Calling all_inventory to load vars for managed_node1 34139 1726867643.73467: Calling groups_inventory to load vars for managed_node1 34139 1726867643.73470: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.73484: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.73488: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.73491: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.73801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.74160: done with get_vars() 34139 1726867643.74168: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:27:23 -0400 (0:00:00.024) 0:00:02.488 ****** 34139 1726867643.74248: entering _queue_task() for managed_node1/include_tasks 34139 1726867643.74456: worker is 1 (out of 1 available) 34139 1726867643.74467: exiting _queue_task() for managed_node1/include_tasks 34139 1726867643.74582: done queuing things up, now waiting for results queue to drain 34139 1726867643.74585: waiting for pending results... 
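The "Fix CentOS6 Base repo" skip traced above follows from its two logged conditionals: `ansible_distribution == 'CentOS'` evaluated True, but `ansible_distribution_major_version == '6'` evaluated False, so the `copy` action never ran. A hypothetical shape of that task, with the parts not visible in the log elided:

```yaml
# Hypothetical reconstruction; only the task name, the copy action,
# and the two conditionals are confirmed by the log.
- name: Fix CentOS6 Base repo
  copy:
    dest: ...     # destination not shown in this log
    content: ...  # repo file contents not shown in this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Note the skip result records the failing condition verbatim in `false_condition`, which is how the log pinpoints which clause short-circuited the task.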
34139 1726867643.74876: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 34139 1726867643.74883: in run() - task 0affcac9-a3a5-c103-b8fd-00000000015e 34139 1726867643.74886: variable 'ansible_search_path' from source: unknown 34139 1726867643.74889: variable 'ansible_search_path' from source: unknown 34139 1726867643.74891: calling self._execute() 34139 1726867643.74923: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.74930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.74941: variable 'omit' from source: magic vars 34139 1726867643.75418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34139 1726867643.77575: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34139 1726867643.77646: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34139 1726867643.77681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34139 1726867643.77938: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34139 1726867643.77941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34139 1726867643.77945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34139 1726867643.77947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34139 1726867643.77950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34139 1726867643.77953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34139 1726867643.77959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34139 1726867643.78082: variable '__network_is_ostree' from source: set_fact 34139 1726867643.78102: Evaluated conditional (not __network_is_ostree | d(false)): True 34139 1726867643.78107: _execute() done 34139 1726867643.78110: dumping result to json 34139 1726867643.78117: done dumping result, returning 34139 1726867643.78123: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-c103-b8fd-00000000015e] 34139 1726867643.78126: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015e 34139 1726867643.78221: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000015e 34139 1726867643.78224: WORKER PROCESS EXITING 34139 1726867643.78274: no more pending results, returning what we have 34139 1726867643.78281: in VariableManager get_vars() 34139 1726867643.78318: Calling all_inventory to load vars for managed_node1 34139 1726867643.78321: Calling groups_inventory to load vars for managed_node1 34139 1726867643.78325: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.78336: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.78339: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.78342: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.78705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 34139 1726867643.78917: done with get_vars() 34139 1726867643.78925: variable 'ansible_search_path' from source: unknown 34139 1726867643.78926: variable 'ansible_search_path' from source: unknown 34139 1726867643.78965: we have included files to process 34139 1726867643.78967: generating all_blocks data 34139 1726867643.78968: done generating all_blocks data 34139 1726867643.78973: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34139 1726867643.78974: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34139 1726867643.78978: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34139 1726867643.79711: done processing included file 34139 1726867643.79713: iterating over new_blocks loaded from include file 34139 1726867643.79714: in VariableManager get_vars() 34139 1726867643.79726: done with get_vars() 34139 1726867643.79727: filtering new block on tags 34139 1726867643.79749: done filtering new block on tags 34139 1726867643.79752: in VariableManager get_vars() 34139 1726867643.79763: done with get_vars() 34139 1726867643.79764: filtering new block on tags 34139 1726867643.79776: done filtering new block on tags 34139 1726867643.79779: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 34139 1726867643.79784: extending task lists for all hosts with included blocks 34139 1726867643.79888: done extending task lists 34139 1726867643.79890: done processing included files 34139 1726867643.79891: results queue empty 34139 1726867643.79891: checking for any_errors_fatal 34139 1726867643.79894: done checking for any_errors_fatal 34139 1726867643.79895: checking for max_fail_percentage 34139 1726867643.79896: done 
checking for max_fail_percentage 34139 1726867643.79896: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.79897: done checking to see if all hosts have failed 34139 1726867643.79898: getting the remaining hosts for this loop 34139 1726867643.79899: done getting the remaining hosts for this loop 34139 1726867643.79901: getting the next task for host managed_node1 34139 1726867643.79905: done getting next task for host managed_node1 34139 1726867643.79910: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34139 1726867643.79914: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.79916: getting variables 34139 1726867643.79917: in VariableManager get_vars() 34139 1726867643.79924: Calling all_inventory to load vars for managed_node1 34139 1726867643.79926: Calling groups_inventory to load vars for managed_node1 34139 1726867643.79929: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.79934: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.79940: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.79944: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.80102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.80284: done with get_vars() 34139 1726867643.80292: done getting variables 34139 1726867643.80350: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34139 1726867643.80542: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:27:23 -0400 (0:00:00.063) 0:00:02.551 ****** 34139 1726867643.80588: entering _queue_task() for managed_node1/command 34139 1726867643.80590: Creating lock for command 34139 1726867643.80831: worker is 1 (out of 1 available) 34139 1726867643.80843: exiting _queue_task() for managed_node1/command 34139 1726867643.80855: done queuing things up, now waiting for results queue to drain 34139 1726867643.80857: waiting for pending results... 
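The include traced above (queued via `_queue_task() for managed_node1/include_tasks`, gated on the logged conditional `not __network_is_ostree | d(false)`) can be sketched as follows; the file path is the one the log reports loading, though the playbook likely references it relatively:

```yaml
# Sketch based on the logged conditional and included-file path
# (el_repo_setup.yml:51); the exact spelling in the playbook may differ.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Since `__network_is_ostree` was set to `false` earlier in the run, `not false | d(false)` is True and the include proceeds, which is why the trace then shows "we have included files to process" and the task list being extended.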
34139 1726867643.81102: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 34139 1726867643.81195: in run() - task 0affcac9-a3a5-c103-b8fd-000000000178 34139 1726867643.81209: variable 'ansible_search_path' from source: unknown 34139 1726867643.81213: variable 'ansible_search_path' from source: unknown 34139 1726867643.81248: calling self._execute() 34139 1726867643.81374: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.81380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.81383: variable 'omit' from source: magic vars 34139 1726867643.81701: variable 'ansible_distribution' from source: facts 34139 1726867643.81709: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34139 1726867643.81840: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.81852: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34139 1726867643.81855: when evaluation is False, skipping this task 34139 1726867643.81858: _execute() done 34139 1726867643.81861: dumping result to json 34139 1726867643.81863: done dumping result, returning 34139 1726867643.81871: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0affcac9-a3a5-c103-b8fd-000000000178] 34139 1726867643.81875: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000178 34139 1726867643.81975: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000178 34139 1726867643.81980: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34139 1726867643.82032: no more pending results, returning what we have 34139 1726867643.82035: results queue empty 34139 1726867643.82035: checking for any_errors_fatal 34139 1726867643.82037: done checking for any_errors_fatal 34139 1726867643.82038: checking for 
max_fail_percentage 34139 1726867643.82040: done checking for max_fail_percentage 34139 1726867643.82040: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.82041: done checking to see if all hosts have failed 34139 1726867643.82042: getting the remaining hosts for this loop 34139 1726867643.82043: done getting the remaining hosts for this loop 34139 1726867643.82046: getting the next task for host managed_node1 34139 1726867643.82055: done getting next task for host managed_node1 34139 1726867643.82057: ^ task is: TASK: Install yum-utils package 34139 1726867643.82061: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.82065: getting variables 34139 1726867643.82067: in VariableManager get_vars() 34139 1726867643.82096: Calling all_inventory to load vars for managed_node1 34139 1726867643.82099: Calling groups_inventory to load vars for managed_node1 34139 1726867643.82102: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.82116: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.82119: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.82122: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.82419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.82659: done with get_vars() 34139 1726867643.82667: done getting variables 34139 1726867643.82754: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:27:23 -0400 (0:00:00.021) 0:00:02.573 ****** 34139 1726867643.82781: entering _queue_task() for managed_node1/package 34139 1726867643.82783: Creating lock for package 34139 1726867643.82992: worker is 1 (out of 1 available) 34139 1726867643.83002: exiting _queue_task() for managed_node1/package 34139 1726867643.83015: done queuing things up, now waiting for results queue to drain 34139 1726867643.83016: waiting for pending results... 
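The "Create EPEL 10" and "Install yum-utils package" skips recorded above both fall out of the same pair of `when` conditionals on tasks in enable_epel.yml (lines 8 and 26 per the task paths). The file's contents are not shown in this log, so the task bodies below are assumptions; only the task names, the action plugins loaded (`command` and `package`), and the two evaluated conditionals are taken from the output:

```yaml
# Hypothetical reconstruction of the two skipped tasks; only the names,
# actions, and 'when' clauses are confirmed by the log above.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "true"  # placeholder -- the real command is not visible in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']

- name: Install yum-utils package
  package:
    name: yum-utils   # assumed from the task name
    state: present    # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this run the first clause held (the distribution is RedHat or CentOS) but the templated task name "Create EPEL 10" indicates a major version of 10, so the membership test failed and both tasks returned `skip_reason: Conditional result was False`.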
34139 1726867643.83294: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 34139 1726867643.83328: in run() - task 0affcac9-a3a5-c103-b8fd-000000000179 34139 1726867643.83340: variable 'ansible_search_path' from source: unknown 34139 1726867643.83343: variable 'ansible_search_path' from source: unknown 34139 1726867643.83373: calling self._execute() 34139 1726867643.83445: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.83449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.83460: variable 'omit' from source: magic vars 34139 1726867643.83810: variable 'ansible_distribution' from source: facts 34139 1726867643.83829: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34139 1726867643.83955: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.83961: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34139 1726867643.83964: when evaluation is False, skipping this task 34139 1726867643.83967: _execute() done 34139 1726867643.83970: dumping result to json 34139 1726867643.83980: done dumping result, returning 34139 1726867643.83984: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affcac9-a3a5-c103-b8fd-000000000179] 34139 1726867643.83987: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000179 34139 1726867643.84079: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000179 34139 1726867643.84082: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34139 1726867643.84127: no more pending results, returning what we have 34139 1726867643.84130: results queue empty 34139 1726867643.84131: checking for any_errors_fatal 34139 1726867643.84137: done checking for any_errors_fatal 34139 
1726867643.84137: checking for max_fail_percentage 34139 1726867643.84139: done checking for max_fail_percentage 34139 1726867643.84139: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.84140: done checking to see if all hosts have failed 34139 1726867643.84141: getting the remaining hosts for this loop 34139 1726867643.84142: done getting the remaining hosts for this loop 34139 1726867643.84145: getting the next task for host managed_node1 34139 1726867643.84151: done getting next task for host managed_node1 34139 1726867643.84153: ^ task is: TASK: Enable EPEL 7 34139 1726867643.84156: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.84159: getting variables 34139 1726867643.84161: in VariableManager get_vars() 34139 1726867643.84189: Calling all_inventory to load vars for managed_node1 34139 1726867643.84192: Calling groups_inventory to load vars for managed_node1 34139 1726867643.84195: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.84207: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.84212: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.84214: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.84497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.84744: done with get_vars() 34139 1726867643.84753: done getting variables 34139 1726867643.84810: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:27:23 -0400 (0:00:00.020) 0:00:02.594 ****** 34139 1726867643.84837: entering _queue_task() for managed_node1/command 34139 1726867643.85063: worker is 1 (out of 1 available) 34139 1726867643.85076: exiting _queue_task() for managed_node1/command 34139 1726867643.85090: done queuing things up, now waiting for results queue to drain 34139 1726867643.85092: waiting for pending results... 
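Every skip in this section follows the same evaluation order: the distribution check passes, the major-version check fails, and the first false clause is echoed back as `false_condition` in the skip result. A small illustrative sketch of that short-circuit behavior (not Ansible's actual implementation, which templates each `when` clause through Jinja2):

```python
def should_run(facts, supported_majors=("7", "8")):
    """Mirror the two conditionals evaluated repeatedly in the log above.

    Returns (run, false_condition): run is True only when both 'when'
    clauses are truthy; false_condition names the first failing clause,
    matching the 'false_condition' field in the skip result JSON.
    """
    if facts["ansible_distribution"] not in ("RedHat", "CentOS"):
        return False, "ansible_distribution in ['RedHat', 'CentOS']"
    if facts["ansible_distribution_major_version"] not in supported_majors:
        return False, "ansible_distribution_major_version in ['7', '8']"
    return True, None
```

Because the clauses short-circuit, the log only ever shows `Evaluated conditional (...): False` for the first clause that fails; later clauses are never templated for that task.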
34139 1726867643.85350: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 34139 1726867643.85442: in run() - task 0affcac9-a3a5-c103-b8fd-00000000017a 34139 1726867643.85453: variable 'ansible_search_path' from source: unknown 34139 1726867643.85456: variable 'ansible_search_path' from source: unknown 34139 1726867643.85532: calling self._execute() 34139 1726867643.85563: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.85567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.85580: variable 'omit' from source: magic vars 34139 1726867643.85968: variable 'ansible_distribution' from source: facts 34139 1726867643.85972: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34139 1726867643.86087: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.86091: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34139 1726867643.86094: when evaluation is False, skipping this task 34139 1726867643.86119: _execute() done 34139 1726867643.86122: dumping result to json 34139 1726867643.86124: done dumping result, returning 34139 1726867643.86127: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affcac9-a3a5-c103-b8fd-00000000017a] 34139 1726867643.86129: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017a 34139 1726867643.86255: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017a 34139 1726867643.86259: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34139 1726867643.86303: no more pending results, returning what we have 34139 1726867643.86306: results queue empty 34139 1726867643.86307: checking for any_errors_fatal 34139 1726867643.86315: done checking for any_errors_fatal 34139 1726867643.86316: checking for 
max_fail_percentage 34139 1726867643.86317: done checking for max_fail_percentage 34139 1726867643.86318: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.86318: done checking to see if all hosts have failed 34139 1726867643.86319: getting the remaining hosts for this loop 34139 1726867643.86320: done getting the remaining hosts for this loop 34139 1726867643.86324: getting the next task for host managed_node1 34139 1726867643.86331: done getting next task for host managed_node1 34139 1726867643.86333: ^ task is: TASK: Enable EPEL 8 34139 1726867643.86336: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.86339: getting variables 34139 1726867643.86341: in VariableManager get_vars() 34139 1726867643.86367: Calling all_inventory to load vars for managed_node1 34139 1726867643.86370: Calling groups_inventory to load vars for managed_node1 34139 1726867643.86373: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.86386: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.86389: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.86392: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.86671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.86876: done with get_vars() 34139 1726867643.86887: done getting variables 34139 1726867643.86937: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:27:23 -0400 (0:00:00.021) 0:00:02.615 ****** 34139 1726867643.86959: entering _queue_task() for managed_node1/command 34139 1726867643.87168: worker is 1 (out of 1 available) 34139 1726867643.87283: exiting _queue_task() for managed_node1/command 34139 1726867643.87292: done queuing things up, now waiting for results queue to drain 34139 1726867643.87294: waiting for pending results... 
34139 1726867643.87585: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 34139 1726867643.87591: in run() - task 0affcac9-a3a5-c103-b8fd-00000000017b 34139 1726867643.87594: variable 'ansible_search_path' from source: unknown 34139 1726867643.87596: variable 'ansible_search_path' from source: unknown 34139 1726867643.87599: calling self._execute() 34139 1726867643.87619: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.87628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.87638: variable 'omit' from source: magic vars 34139 1726867643.87992: variable 'ansible_distribution' from source: facts 34139 1726867643.88006: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34139 1726867643.88132: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.88135: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34139 1726867643.88140: when evaluation is False, skipping this task 34139 1726867643.88143: _execute() done 34139 1726867643.88145: dumping result to json 34139 1726867643.88148: done dumping result, returning 34139 1726867643.88155: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affcac9-a3a5-c103-b8fd-00000000017b] 34139 1726867643.88159: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017b 34139 1726867643.88363: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017b 34139 1726867643.88367: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34139 1726867643.88404: no more pending results, returning what we have 34139 1726867643.88407: results queue empty 34139 1726867643.88407: checking for any_errors_fatal 34139 1726867643.88411: done checking for any_errors_fatal 34139 1726867643.88412: checking for 
max_fail_percentage 34139 1726867643.88413: done checking for max_fail_percentage 34139 1726867643.88414: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.88415: done checking to see if all hosts have failed 34139 1726867643.88416: getting the remaining hosts for this loop 34139 1726867643.88416: done getting the remaining hosts for this loop 34139 1726867643.88419: getting the next task for host managed_node1 34139 1726867643.88426: done getting next task for host managed_node1 34139 1726867643.88428: ^ task is: TASK: Enable EPEL 6 34139 1726867643.88432: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867643.88434: getting variables 34139 1726867643.88440: in VariableManager get_vars() 34139 1726867643.88463: Calling all_inventory to load vars for managed_node1 34139 1726867643.88465: Calling groups_inventory to load vars for managed_node1 34139 1726867643.88469: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.88476: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.88481: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.88484: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.88665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.88890: done with get_vars() 34139 1726867643.88900: done getting variables 34139 1726867643.88953: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:27:23 -0400 (0:00:00.020) 0:00:02.635 ****** 34139 1726867643.88985: entering _queue_task() for managed_node1/copy 34139 1726867643.89186: worker is 1 (out of 1 available) 34139 1726867643.89311: exiting _queue_task() for managed_node1/copy 34139 1726867643.89321: done queuing things up, now waiting for results queue to drain 34139 1726867643.89323: waiting for pending results... 
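The "Enable EPEL 6" task just queued differs from the EPEL 7/8 tasks in two ways visible in this log: it loads the `copy` action module rather than `command`, and its guard is an equality test (`ansible_distribution_major_version == '6'`) instead of a membership test. A hedged sketch of its likely shape; everything beyond the name, the action, and the conditional is an assumption:

```yaml
# Hypothetical; only the task name, the copy action, and the equality
# conditional are confirmed by the log.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: ""                        # real content not visible in this log
  when: ansible_distribution_major_version == '6'
```

On this host the equality test evaluates False (the version is not '6'), so the task is skipped like the others, just with a different `false_condition`.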
34139 1726867643.89539: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 34139 1726867643.89551: in run() - task 0affcac9-a3a5-c103-b8fd-00000000017d 34139 1726867643.89568: variable 'ansible_search_path' from source: unknown 34139 1726867643.89574: variable 'ansible_search_path' from source: unknown 34139 1726867643.89614: calling self._execute() 34139 1726867643.89692: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.89703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.89716: variable 'omit' from source: magic vars 34139 1726867643.90140: variable 'ansible_distribution' from source: facts 34139 1726867643.90156: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34139 1726867643.90276: variable 'ansible_distribution_major_version' from source: facts 34139 1726867643.90292: Evaluated conditional (ansible_distribution_major_version == '6'): False 34139 1726867643.90384: when evaluation is False, skipping this task 34139 1726867643.90388: _execute() done 34139 1726867643.90391: dumping result to json 34139 1726867643.90393: done dumping result, returning 34139 1726867643.90396: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affcac9-a3a5-c103-b8fd-00000000017d] 34139 1726867643.90398: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017d 34139 1726867643.90464: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000017d 34139 1726867643.90467: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34139 1726867643.90513: no more pending results, returning what we have 34139 1726867643.90516: results queue empty 34139 1726867643.90517: checking for any_errors_fatal 34139 1726867643.90521: done checking for any_errors_fatal 34139 1726867643.90522: checking for max_fail_percentage 
34139 1726867643.90523: done checking for max_fail_percentage 34139 1726867643.90524: checking to see if all hosts have failed and the running result is not ok 34139 1726867643.90525: done checking to see if all hosts have failed 34139 1726867643.90525: getting the remaining hosts for this loop 34139 1726867643.90527: done getting the remaining hosts for this loop 34139 1726867643.90530: getting the next task for host managed_node1 34139 1726867643.90539: done getting next task for host managed_node1 34139 1726867643.90542: ^ task is: TASK: Set network provider to 'nm' 34139 1726867643.90545: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34139 1726867643.90549: getting variables 34139 1726867643.90550: in VariableManager get_vars() 34139 1726867643.90724: Calling all_inventory to load vars for managed_node1 34139 1726867643.90728: Calling groups_inventory to load vars for managed_node1 34139 1726867643.90731: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867643.90739: Calling all_plugins_play to load vars for managed_node1 34139 1726867643.90742: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867643.90745: Calling groups_plugins_play to load vars for managed_node1 34139 1726867643.90947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867643.91143: done with get_vars() 34139 1726867643.91152: done getting variables 34139 1726867643.91214: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 17:27:23 -0400 (0:00:00.022) 0:00:02.658 ****** 34139 1726867643.91240: entering _queue_task() for managed_node1/set_fact 34139 1726867643.91498: worker is 1 (out of 1 available) 34139 1726867643.91511: exiting _queue_task() for managed_node1/set_fact 34139 1726867643.91523: done queuing things up, now waiting for results queue to drain 34139 1726867643.91524: waiting for pending results... 34139 1726867643.92004: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 34139 1726867643.92010: in run() - task 0affcac9-a3a5-c103-b8fd-000000000007 34139 1726867643.92014: variable 'ansible_search_path' from source: unknown 34139 1726867643.92017: calling self._execute() 34139 1726867643.92046: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.92059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.92074: variable 'omit' from source: magic vars 34139 1726867643.92207: variable 'omit' from source: magic vars 34139 1726867643.92246: variable 'omit' from source: magic vars 34139 1726867643.92293: variable 'omit' from source: magic vars 34139 1726867643.92449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34139 1726867643.92489: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34139 1726867643.92542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34139 1726867643.92545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.92563: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34139 1726867643.92602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34139 1726867643.92611: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.92651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.92726: Set connection var ansible_timeout to 10 34139 1726867643.92737: Set connection var ansible_shell_type to sh 34139 1726867643.92744: Set connection var ansible_shell_executable to /bin/sh 34139 1726867643.92757: Set connection var ansible_pipelining to False 34139 1726867643.92767: Set connection var ansible_connection to ssh 34139 1726867643.92868: Set connection var ansible_module_compression to ZIP_DEFLATED 34139 1726867643.92871: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.92874: variable 'ansible_connection' from source: unknown 34139 1726867643.92876: variable 'ansible_module_compression' from source: unknown 34139 1726867643.92879: variable 'ansible_shell_type' from source: unknown 34139 1726867643.92882: variable 'ansible_shell_executable' from source: unknown 34139 1726867643.92884: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867643.92886: variable 'ansible_pipelining' from source: unknown 34139 1726867643.92888: variable 'ansible_timeout' from source: unknown 34139 1726867643.92890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867643.92981: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34139 1726867643.92998: variable 'omit' from source: magic vars 34139 1726867643.93007: starting 
attempt loop
34139 1726867643.93013: running the handler
34139 1726867643.93027: handler run complete
34139 1726867643.93040: attempt loop complete, returning result
34139 1726867643.93046: _execute() done
34139 1726867643.93052: dumping result to json
34139 1726867643.93059: done dumping result, returning
34139 1726867643.93069: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affcac9-a3a5-c103-b8fd-000000000007]
34139 1726867643.93076: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000007
ok: [managed_node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
34139 1726867643.93237: no more pending results, returning what we have
34139 1726867643.93240: results queue empty
34139 1726867643.93241: checking for any_errors_fatal
34139 1726867643.93247: done checking for any_errors_fatal
34139 1726867643.93248: checking for max_fail_percentage
34139 1726867643.93250: done checking for max_fail_percentage
34139 1726867643.93251: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.93252: done checking to see if all hosts have failed
34139 1726867643.93253: getting the remaining hosts for this loop
34139 1726867643.93255: done getting the remaining hosts for this loop
34139 1726867643.93258: getting the next task for host managed_node1
34139 1726867643.93266: done getting next task for host managed_node1
34139 1726867643.93268: ^ task is: TASK: meta (flush_handlers)
34139 1726867643.93270: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.93273: getting variables
34139 1726867643.93276: in VariableManager get_vars()
34139 1726867643.93305: Calling all_inventory to load vars for managed_node1
34139 1726867643.93309: Calling groups_inventory to load vars for managed_node1
34139 1726867643.93312: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.93322: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.93324: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.93327: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.93704: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000007
34139 1726867643.93707: WORKER PROCESS EXITING
34139 1726867643.93730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.93949: done with get_vars()
34139 1726867643.93958: done getting variables
34139 1726867643.94024: in VariableManager get_vars()
34139 1726867643.94033: Calling all_inventory to load vars for managed_node1
34139 1726867643.94036: Calling groups_inventory to load vars for managed_node1
34139 1726867643.94038: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.94042: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.94044: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.94047: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.94225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.94437: done with get_vars()
34139 1726867643.94451: done queuing things up, now waiting for results queue to drain
34139 1726867643.94453: results queue empty
34139 1726867643.94454: checking for any_errors_fatal
34139 1726867643.94460: done checking for any_errors_fatal
34139 1726867643.94461: checking for max_fail_percentage
34139 1726867643.94462: done checking for max_fail_percentage
34139 1726867643.94463: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.94464: done checking to see if all hosts have failed
34139 1726867643.94465: getting the remaining hosts for this loop
34139 1726867643.94466: done getting the remaining hosts for this loop
34139 1726867643.94468: getting the next task for host managed_node1
34139 1726867643.94471: done getting next task for host managed_node1
34139 1726867643.94473: ^ task is: TASK: meta (flush_handlers)
34139 1726867643.94474: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.94483: getting variables
34139 1726867643.94484: in VariableManager get_vars()
34139 1726867643.94491: Calling all_inventory to load vars for managed_node1
34139 1726867643.94493: Calling groups_inventory to load vars for managed_node1
34139 1726867643.94495: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.94500: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.94502: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.94504: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.94656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.94867: done with get_vars()
34139 1726867643.94875: done getting variables
34139 1726867643.94923: in VariableManager get_vars()
34139 1726867643.94931: Calling all_inventory to load vars for managed_node1
34139 1726867643.94933: Calling groups_inventory to load vars for managed_node1
34139 1726867643.94935: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.94939: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.94941: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.94944: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.95120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.95328: done with get_vars()
34139 1726867643.95339: done queuing things up, now waiting for results queue to drain
34139 1726867643.95340: results queue empty
34139 1726867643.95341: checking for any_errors_fatal
34139 1726867643.95342: done checking for any_errors_fatal
34139 1726867643.95343: checking for max_fail_percentage
34139 1726867643.95344: done checking for max_fail_percentage
34139 1726867643.95344: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.95345: done checking to see if all hosts have failed
34139 1726867643.95346: getting the remaining hosts for this loop
34139 1726867643.95346: done getting the remaining hosts for this loop
34139 1726867643.95349: getting the next task for host managed_node1
34139 1726867643.95351: done getting next task for host managed_node1
34139 1726867643.95352: ^ task is: None
34139 1726867643.95353: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.95354: done queuing things up, now waiting for results queue to drain
34139 1726867643.95355: results queue empty
34139 1726867643.95356: checking for any_errors_fatal
34139 1726867643.95356: done checking for any_errors_fatal
34139 1726867643.95357: checking for max_fail_percentage
34139 1726867643.95358: done checking for max_fail_percentage
34139 1726867643.95359: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.95359: done checking to see if all hosts have failed
34139 1726867643.95361: getting the next task for host managed_node1
34139 1726867643.95363: done getting next task for host managed_node1
34139 1726867643.95364: ^ task is: None
34139 1726867643.95365: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.95411: in VariableManager get_vars()
34139 1726867643.95445: done with get_vars()
34139 1726867643.95452: in VariableManager get_vars()
34139 1726867643.95470: done with get_vars()
34139 1726867643.95474: variable 'omit' from source: magic vars
34139 1726867643.95507: in VariableManager get_vars()
34139 1726867643.95527: done with get_vars()
34139 1726867643.95555: variable 'omit' from source: magic vars

PLAY [Play for testing wireless connection] ************************************
34139 1726867643.96266: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
34139 1726867643.96300: getting the remaining hosts for this loop
34139 1726867643.96302: done getting the remaining hosts for this loop
34139 1726867643.96304: getting the next task for host managed_node1
34139 1726867643.96307: done getting next task for host managed_node1
34139 1726867643.96309: ^ task is: TASK: Gathering Facts
34139 1726867643.96310: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.96312: getting variables
34139 1726867643.96313: in VariableManager get_vars()
34139 1726867643.96330: Calling all_inventory to load vars for managed_node1
34139 1726867643.96332: Calling groups_inventory to load vars for managed_node1
34139 1726867643.96334: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.96339: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.96353: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.96355: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.96507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.96750: done with get_vars()
34139 1726867643.96758: done getting variables
34139 1726867643.96803: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3
Friday 20 September 2024  17:27:23 -0400 (0:00:00.055)       0:00:02.714 ******
34139 1726867643.96827: entering _queue_task() for managed_node1/gather_facts
34139 1726867643.97285: worker is 1 (out of 1 available)
34139 1726867643.97296: exiting _queue_task() for managed_node1/gather_facts
34139 1726867643.97305: done queuing things up, now waiting for results queue to drain
34139 1726867643.97307: waiting for pending results...
34139 1726867643.97546: running TaskExecutor() for managed_node1/TASK: Gathering Facts
34139 1726867643.97700: in run() - task 0affcac9-a3a5-c103-b8fd-0000000001a3
34139 1726867643.97705: variable 'ansible_search_path' from source: unknown
34139 1726867643.97765: calling self._execute()
34139 1726867643.97888: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867643.97901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867643.97933: variable 'omit' from source: magic vars
34139 1726867643.98336: variable 'ansible_distribution_major_version' from source: facts
34139 1726867643.98371: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867643.98490: variable 'ansible_distribution_major_version' from source: facts
34139 1726867643.98586: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867643.98590: when evaluation is False, skipping this task
34139 1726867643.98592: _execute() done
34139 1726867643.98594: dumping result to json
34139 1726867643.98596: done dumping result, returning
34139 1726867643.98599: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-c103-b8fd-0000000001a3]
34139 1726867643.98601: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000001a3
34139 1726867643.98661: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000001a3
34139 1726867643.98664: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867643.98735: no more pending results, returning what we have
34139 1726867643.98739: results queue empty
34139 1726867643.98739: checking for any_errors_fatal
34139 1726867643.98741: done checking for any_errors_fatal
34139 1726867643.98741: checking for max_fail_percentage
34139 1726867643.98743: done checking for max_fail_percentage
34139 1726867643.98744: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.98745: done checking to see if all hosts have failed
34139 1726867643.98745: getting the remaining hosts for this loop
34139 1726867643.98746: done getting the remaining hosts for this loop
34139 1726867643.98750: getting the next task for host managed_node1
34139 1726867643.98757: done getting next task for host managed_node1
34139 1726867643.98759: ^ task is: TASK: meta (flush_handlers)
34139 1726867643.98761: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.98765: getting variables
34139 1726867643.98767: in VariableManager get_vars()
34139 1726867643.98953: Calling all_inventory to load vars for managed_node1
34139 1726867643.98956: Calling groups_inventory to load vars for managed_node1
34139 1726867643.98958: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.98966: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.98968: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.98970: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.99202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.99341: done with get_vars()
34139 1726867643.99348: done getting variables
34139 1726867643.99395: in VariableManager get_vars()
34139 1726867643.99406: Calling all_inventory to load vars for managed_node1
34139 1726867643.99409: Calling groups_inventory to load vars for managed_node1
34139 1726867643.99411: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.99416: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.99419: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.99420: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.99505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.99623: done with get_vars()
34139 1726867643.99632: done queuing things up, now waiting for results queue to drain
34139 1726867643.99634: results queue empty
34139 1726867643.99635: checking for any_errors_fatal
34139 1726867643.99636: done checking for any_errors_fatal
34139 1726867643.99637: checking for max_fail_percentage
34139 1726867643.99637: done checking for max_fail_percentage
34139 1726867643.99638: checking to see if all hosts have failed and the running result is not ok
34139 1726867643.99638: done checking to see if all hosts have failed
34139 1726867643.99639: getting the remaining hosts for this loop
34139 1726867643.99639: done getting the remaining hosts for this loop
34139 1726867643.99641: getting the next task for host managed_node1
34139 1726867643.99643: done getting next task for host managed_node1
34139 1726867643.99644: ^ task is: TASK: INIT: wireless tests
34139 1726867643.99645: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867643.99647: getting variables
34139 1726867643.99647: in VariableManager get_vars()
34139 1726867643.99658: Calling all_inventory to load vars for managed_node1
34139 1726867643.99659: Calling groups_inventory to load vars for managed_node1
34139 1726867643.99661: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867643.99664: Calling all_plugins_play to load vars for managed_node1
34139 1726867643.99665: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867643.99666: Calling groups_plugins_play to load vars for managed_node1
34139 1726867643.99771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867643.99888: done with get_vars()
34139 1726867643.99893: done getting variables
34139 1726867643.99949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [INIT: wireless tests] ****************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8
Friday 20 September 2024  17:27:23 -0400 (0:00:00.031)       0:00:02.745 ******
34139 1726867643.99968: entering _queue_task() for managed_node1/debug
34139 1726867643.99969: Creating lock for debug
34139 1726867644.00181: worker is 1 (out of 1 available)
34139 1726867644.00196: exiting _queue_task() for managed_node1/debug
34139 1726867644.00206: done queuing things up, now waiting for results queue to drain
34139 1726867644.00212: waiting for pending results...
34139 1726867644.00361: running TaskExecutor() for managed_node1/TASK: INIT: wireless tests
34139 1726867644.00415: in run() - task 0affcac9-a3a5-c103-b8fd-00000000000b
34139 1726867644.00427: variable 'ansible_search_path' from source: unknown
34139 1726867644.00457: calling self._execute()
34139 1726867644.00519: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.00523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.00531: variable 'omit' from source: magic vars
34139 1726867644.00815: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.00818: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.00947: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.00951: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.00953: when evaluation is False, skipping this task
34139 1726867644.00956: _execute() done
34139 1726867644.00958: dumping result to json
34139 1726867644.00960: done dumping result, returning
34139 1726867644.00963: done running TaskExecutor() for managed_node1/TASK: INIT: wireless tests [0affcac9-a3a5-c103-b8fd-00000000000b]
34139 1726867644.00966: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000b
34139 1726867644.01194: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000b
34139 1726867644.01197: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867644.01236: no more pending results, returning what we have
34139 1726867644.01239: results queue empty
34139 1726867644.01239: checking for any_errors_fatal
34139 1726867644.01241: done checking for any_errors_fatal
34139 1726867644.01241: checking for max_fail_percentage
34139 1726867644.01242: done checking for max_fail_percentage
34139 1726867644.01243: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.01244: done checking to see if all hosts have failed
34139 1726867644.01245: getting the remaining hosts for this loop
34139 1726867644.01245: done getting the remaining hosts for this loop
34139 1726867644.01248: getting the next task for host managed_node1
34139 1726867644.01254: done getting next task for host managed_node1
34139 1726867644.01256: ^ task is: TASK: Include the task 'setup_mock_wifi.yml'
34139 1726867644.01258: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.01261: getting variables
34139 1726867644.01263: in VariableManager get_vars()
34139 1726867644.01303: Calling all_inventory to load vars for managed_node1
34139 1726867644.01306: Calling groups_inventory to load vars for managed_node1
34139 1726867644.01311: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.01319: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.01322: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.01324: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.01497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.01712: done with get_vars()
34139 1726867644.01721: done getting variables

TASK [Include the task 'setup_mock_wifi.yml'] **********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11
Friday 20 September 2024  17:27:24 -0400 (0:00:00.018)       0:00:02.764 ******
34139 1726867644.01806: entering _queue_task() for managed_node1/include_tasks
34139 1726867644.02052: worker is 1 (out of 1 available)
34139 1726867644.02065: exiting _queue_task() for managed_node1/include_tasks
34139 1726867644.02078: done queuing things up, now waiting for results queue to drain
34139 1726867644.02080: waiting for pending results...
34139 1726867644.02264: running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml'
34139 1726867644.02321: in run() - task 0affcac9-a3a5-c103-b8fd-00000000000c
34139 1726867644.02332: variable 'ansible_search_path' from source: unknown
34139 1726867644.02361: calling self._execute()
34139 1726867644.02422: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.02425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.02434: variable 'omit' from source: magic vars
34139 1726867644.02696: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.02713: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.02786: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.02790: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.02793: when evaluation is False, skipping this task
34139 1726867644.02796: _execute() done
34139 1726867644.02798: dumping result to json
34139 1726867644.02803: done dumping result, returning
34139 1726867644.02813: done running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' [0affcac9-a3a5-c103-b8fd-00000000000c]
34139 1726867644.02816: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000c
34139 1726867644.02899: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000c
34139 1726867644.02902: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.02955: no more pending results, returning what we have
34139 1726867644.02958: results queue empty
34139 1726867644.02959: checking for any_errors_fatal
34139 1726867644.02964: done checking for any_errors_fatal
34139 1726867644.02964: checking for max_fail_percentage
34139 1726867644.02966: done checking for max_fail_percentage
34139 1726867644.02966: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.02967: done checking to see if all hosts have failed
34139 1726867644.02968: getting the remaining hosts for this loop
34139 1726867644.02969: done getting the remaining hosts for this loop
34139 1726867644.02972: getting the next task for host managed_node1
34139 1726867644.02976: done getting next task for host managed_node1
34139 1726867644.02981: ^ task is: TASK: Copy client certs
34139 1726867644.02983: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.02985: getting variables
34139 1726867644.02986: in VariableManager get_vars()
34139 1726867644.03023: Calling all_inventory to load vars for managed_node1
34139 1726867644.03026: Calling groups_inventory to load vars for managed_node1
34139 1726867644.03028: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.03038: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.03040: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.03043: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.03186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.03305: done with get_vars()
34139 1726867644.03315: done getting variables
34139 1726867644.03352: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Copy client certs] *******************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
Friday 20 September 2024  17:27:24 -0400 (0:00:00.015)       0:00:02.779 ******
34139 1726867644.03371: entering _queue_task() for managed_node1/copy
34139 1726867644.03547: worker is 1 (out of 1 available)
34139 1726867644.03560: exiting _queue_task() for managed_node1/copy
34139 1726867644.03571: done queuing things up, now waiting for results queue to drain
34139 1726867644.03573: waiting for pending results...
34139 1726867644.03719: running TaskExecutor() for managed_node1/TASK: Copy client certs
34139 1726867644.03767: in run() - task 0affcac9-a3a5-c103-b8fd-00000000000d
34139 1726867644.03778: variable 'ansible_search_path' from source: unknown
34139 1726867644.03959: Loaded config def from plugin (lookup/items)
34139 1726867644.03974: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py
34139 1726867644.04015: variable 'omit' from source: magic vars
34139 1726867644.04383: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.04386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.04388: variable 'omit' from source: magic vars
34139 1726867644.04490: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.04505: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.04621: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.04633: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.04642: when evaluation is False, skipping this task
34139 1726867644.04670: variable 'item' from source: unknown
34139 1726867644.04746: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.key) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.key",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.04950: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.04963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.04979: variable 'omit' from source: magic vars
34139 1726867644.05132: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.05143: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.05251: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.05263: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.05270: when evaluation is False, skipping this task
34139 1726867644.05298: variable 'item' from source: unknown
34139 1726867644.05361: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.pem",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.05682: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.05685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.05688: variable 'omit' from source: magic vars
34139 1726867644.05690: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.05692: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.05782: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.05793: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.05800: when evaluation is False, skipping this task
34139 1726867644.05830: variable 'item' from source: unknown
34139 1726867644.05892: variable 'item' from source: unknown
skipping: [managed_node1] => (item=cacert.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "cacert.pem",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.05984: dumping result to json
34139 1726867644.05995: done dumping result, returning
34139 1726867644.06005: done running TaskExecutor() for managed_node1/TASK: Copy client certs [0affcac9-a3a5-c103-b8fd-00000000000d]
34139 1726867644.06017: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000d
skipping: [managed_node1] => {
    "changed": false
}

MSG:

All items skipped
34139 1726867644.06128: no more pending results, returning what we have
34139 1726867644.06132: results queue empty
34139 1726867644.06132: checking for any_errors_fatal
34139 1726867644.06137: done checking for any_errors_fatal
34139 1726867644.06137: checking for max_fail_percentage
34139 1726867644.06141: done checking for max_fail_percentage
34139 1726867644.06141: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.06142: done checking to see if all hosts have failed
34139 1726867644.06143: getting the remaining hosts for this loop
34139 1726867644.06144: done getting the remaining hosts for this loop
34139 1726867644.06147: getting the next task for host managed_node1
34139 1726867644.06154: done getting next task for host managed_node1
34139 1726867644.06156: ^ task is: TASK: TEST: wireless connection with WPA-PSK
34139 1726867644.06158: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.06161: getting variables
34139 1726867644.06162: in VariableManager get_vars()
34139 1726867644.06207: Calling all_inventory to load vars for managed_node1
34139 1726867644.06210: Calling groups_inventory to load vars for managed_node1
34139 1726867644.06212: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.06223: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.06226: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.06229: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.06468: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000d
34139 1726867644.06471: WORKER PROCESS EXITING
34139 1726867644.06495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.06948: done with get_vars()
34139 1726867644.06957: done getting variables
34139 1726867644.07014: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: wireless connection with WPA-PSK] **********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24
Friday 20 September 2024  17:27:24 -0400 (0:00:00.036)       0:00:02.816 ******
34139 1726867644.07037: entering _queue_task() for managed_node1/debug
34139 1726867644.07294: worker is 1 (out of 1 available)
34139 1726867644.07305: exiting _queue_task() for managed_node1/debug
34139 1726867644.07319: done queuing things up, now waiting for results queue to drain
34139 1726867644.07321: waiting for pending results...
34139 1726867644.07562: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK 34139 1726867644.07666: in run() - task 0affcac9-a3a5-c103-b8fd-00000000000f 34139 1726867644.07693: variable 'ansible_search_path' from source: unknown 34139 1726867644.07734: calling self._execute() 34139 1726867644.07823: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.07834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.07851: variable 'omit' from source: magic vars 34139 1726867644.08231: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.08249: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.08371: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.08482: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.08486: when evaluation is False, skipping this task 34139 1726867644.08488: _execute() done 34139 1726867644.08490: dumping result to json 34139 1726867644.08493: done dumping result, returning 34139 1726867644.08494: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK [0affcac9-a3a5-c103-b8fd-00000000000f] 34139 1726867644.08497: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000f 34139 1726867644.08562: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000000f 34139 1726867644.08565: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867644.08618: no more pending results, returning what we have 34139 1726867644.08621: results queue empty 34139 1726867644.08622: checking for any_errors_fatal 34139 1726867644.08632: done checking for any_errors_fatal 34139 1726867644.08632: checking for max_fail_percentage 34139 1726867644.08634: done checking for max_fail_percentage 34139 
1726867644.08634: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.08635: done checking to see if all hosts have failed 34139 1726867644.08636: getting the remaining hosts for this loop 34139 1726867644.08637: done getting the remaining hosts for this loop 34139 1726867644.08641: getting the next task for host managed_node1 34139 1726867644.08649: done getting next task for host managed_node1 34139 1726867644.08655: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34139 1726867644.08658: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.08674: getting variables 34139 1726867644.08676: in VariableManager get_vars() 34139 1726867644.08727: Calling all_inventory to load vars for managed_node1 34139 1726867644.08729: Calling groups_inventory to load vars for managed_node1 34139 1726867644.08732: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.08742: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.08745: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.08747: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.09117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.09350: done with get_vars() 34139 1726867644.09361: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:24 -0400 (0:00:00.024) 0:00:02.840 ****** 34139 1726867644.09456: entering _queue_task() for managed_node1/include_tasks 34139 1726867644.09699: worker is 1 (out of 1 available) 34139 1726867644.09713: exiting _queue_task() for managed_node1/include_tasks 34139 1726867644.09724: done queuing things up, now waiting for results queue to drain 34139 1726867644.09725: waiting for pending results... 
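Every task in this run is skipped by the same pair of conditionals: `ansible_distribution_major_version != '6'` evaluates True, then `ansible_distribution_major_version == '7'` evaluates False, and the task is skipped with that second expression reported as the `false_condition`. A minimal sketch of a task that would log this exact pair of `Evaluated conditional` lines (the task name and message are illustrative, not taken from the role):

```yaml
# Hypothetical task -- only the `when` list mirrors the conditionals in the log.
# Items in a `when` list are ANDed together; the first item that evaluates
# False is reported as "false_condition" in the skip result.
- name: Example task gated to EL7 hosts
  ansible.builtin.debug:
    msg: "This only runs when the distribution major version is 7"
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

On the managed node in this run the second item is False, so the skip result carries `"false_condition": "ansible_distribution_major_version == '7'"`, matching the `skipping: [managed_node1]` output above.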
34139 1726867644.10094: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34139 1726867644.10130: in run() - task 0affcac9-a3a5-c103-b8fd-000000000017 34139 1726867644.10151: variable 'ansible_search_path' from source: unknown 34139 1726867644.10158: variable 'ansible_search_path' from source: unknown 34139 1726867644.10201: calling self._execute() 34139 1726867644.10280: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.10295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.10314: variable 'omit' from source: magic vars 34139 1726867644.10673: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.10693: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.10818: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.10835: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.10844: when evaluation is False, skipping this task 34139 1726867644.10853: _execute() done 34139 1726867644.10862: dumping result to json 34139 1726867644.10871: done dumping result, returning 34139 1726867644.10945: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c103-b8fd-000000000017] 34139 1726867644.10948: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000017 34139 1726867644.11020: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000017 34139 1726867644.11023: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.11097: no more pending results, returning what we have 34139 1726867644.11100: results queue empty 34139 1726867644.11101: checking for 
any_errors_fatal 34139 1726867644.11110: done checking for any_errors_fatal 34139 1726867644.11110: checking for max_fail_percentage 34139 1726867644.11112: done checking for max_fail_percentage 34139 1726867644.11113: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.11114: done checking to see if all hosts have failed 34139 1726867644.11114: getting the remaining hosts for this loop 34139 1726867644.11116: done getting the remaining hosts for this loop 34139 1726867644.11119: getting the next task for host managed_node1 34139 1726867644.11127: done getting next task for host managed_node1 34139 1726867644.11131: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34139 1726867644.11134: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.11150: getting variables 34139 1726867644.11152: in VariableManager get_vars() 34139 1726867644.11200: Calling all_inventory to load vars for managed_node1 34139 1726867644.11203: Calling groups_inventory to load vars for managed_node1 34139 1726867644.11206: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.11220: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.11223: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.11227: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.11642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.11866: done with get_vars() 34139 1726867644.11876: done getting variables 34139 1726867644.11937: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:24 -0400 (0:00:00.025) 0:00:02.865 ****** 34139 1726867644.11967: entering _queue_task() for managed_node1/debug 34139 1726867644.12200: worker is 1 (out of 1 available) 34139 1726867644.12215: exiting _queue_task() for managed_node1/debug 34139 1726867644.12226: done queuing things up, now waiting for results queue to drain 34139 1726867644.12227: waiting for pending results... 
34139 1726867644.12483: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34139 1726867644.12686: in run() - task 0affcac9-a3a5-c103-b8fd-000000000018 34139 1726867644.12690: variable 'ansible_search_path' from source: unknown 34139 1726867644.12694: variable 'ansible_search_path' from source: unknown 34139 1726867644.12697: calling self._execute() 34139 1726867644.12761: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.12773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.12797: variable 'omit' from source: magic vars 34139 1726867644.13164: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.13184: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.13303: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.13336: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.13339: when evaluation is False, skipping this task 34139 1726867644.13341: _execute() done 34139 1726867644.13343: dumping result to json 34139 1726867644.13345: done dumping result, returning 34139 1726867644.13353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c103-b8fd-000000000018] 34139 1726867644.13482: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000018 34139 1726867644.13544: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000018 34139 1726867644.13548: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867644.13593: no more pending results, returning what we have 34139 1726867644.13596: results queue empty 34139 1726867644.13597: checking for any_errors_fatal 34139 1726867644.13602: done checking for any_errors_fatal 34139 1726867644.13603: 
checking for max_fail_percentage 34139 1726867644.13605: done checking for max_fail_percentage 34139 1726867644.13605: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.13606: done checking to see if all hosts have failed 34139 1726867644.13607: getting the remaining hosts for this loop 34139 1726867644.13611: done getting the remaining hosts for this loop 34139 1726867644.13614: getting the next task for host managed_node1 34139 1726867644.13621: done getting next task for host managed_node1 34139 1726867644.13625: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34139 1726867644.13628: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.13642: getting variables 34139 1726867644.13644: in VariableManager get_vars() 34139 1726867644.13692: Calling all_inventory to load vars for managed_node1 34139 1726867644.13695: Calling groups_inventory to load vars for managed_node1 34139 1726867644.13698: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.13711: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.13714: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.13717: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.14010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.14239: done with get_vars() 34139 1726867644.14249: done getting variables 34139 1726867644.14337: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:24 -0400 (0:00:00.023) 0:00:02.889 ****** 34139 1726867644.14367: entering _queue_task() for managed_node1/fail 34139 1726867644.14368: Creating lock for fail 34139 1726867644.14603: worker is 1 (out of 1 available) 34139 1726867644.14619: exiting _queue_task() for managed_node1/fail 34139 1726867644.14630: done queuing things up, now waiting for results queue to drain 34139 1726867644.14631: waiting for pending results... 
34139 1726867644.14880: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34139 1726867644.15085: in run() - task 0affcac9-a3a5-c103-b8fd-000000000019 34139 1726867644.15089: variable 'ansible_search_path' from source: unknown 34139 1726867644.15092: variable 'ansible_search_path' from source: unknown 34139 1726867644.15095: calling self._execute() 34139 1726867644.15156: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.15168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.15185: variable 'omit' from source: magic vars 34139 1726867644.15560: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.15628: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.15711: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.15724: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.15737: when evaluation is False, skipping this task 34139 1726867644.15746: _execute() done 34139 1726867644.15754: dumping result to json 34139 1726867644.15762: done dumping result, returning 34139 1726867644.15774: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c103-b8fd-000000000019] 34139 1726867644.15787: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000019 34139 1726867644.15914: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000019 34139 1726867644.15918: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
34139 1726867644.15996: no more pending results, returning what we have 34139 1726867644.15999: results queue empty 34139 1726867644.16000: checking for any_errors_fatal 34139 1726867644.16006: done checking for any_errors_fatal 34139 1726867644.16007: checking for max_fail_percentage 34139 1726867644.16011: done checking for max_fail_percentage 34139 1726867644.16012: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.16013: done checking to see if all hosts have failed 34139 1726867644.16013: getting the remaining hosts for this loop 34139 1726867644.16014: done getting the remaining hosts for this loop 34139 1726867644.16018: getting the next task for host managed_node1 34139 1726867644.16024: done getting next task for host managed_node1 34139 1726867644.16027: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867644.16031: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.16046: getting variables 34139 1726867644.16048: in VariableManager get_vars() 34139 1726867644.16094: Calling all_inventory to load vars for managed_node1 34139 1726867644.16097: Calling groups_inventory to load vars for managed_node1 34139 1726867644.16099: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.16112: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.16115: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.16118: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.16491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.16710: done with get_vars() 34139 1726867644.16719: done getting variables 34139 1726867644.16771: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:24 -0400 (0:00:00.024) 0:00:02.914 ****** 34139 1726867644.16802: entering _queue_task() for managed_node1/fail 34139 1726867644.17020: worker is 1 (out of 1 available) 34139 1726867644.17033: exiting _queue_task() for managed_node1/fail 34139 1726867644.17043: done queuing things up, now waiting for results queue to drain 34139 1726867644.17044: waiting for pending results... 
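The worker/queue internals, per-variable source annotations, and timestamped entries seen throughout this log correspond to maximum verbosity. A sketch of the invocation that produces this level of output, assuming the inventory path from the log header and the playbook path from the task annotations (the exact command line used for this run is not shown in the log):

```shell
# -vvvv enables connection-level debug output, including the
# "entering _queue_task()" and "variable ... from source:" lines above.
ansible-playbook -vvvv \
  -i /tmp/network-5rw/inventory.yml \
  /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
```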
34139 1726867644.17434: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867644.17439: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001a 34139 1726867644.17442: variable 'ansible_search_path' from source: unknown 34139 1726867644.17445: variable 'ansible_search_path' from source: unknown 34139 1726867644.17455: calling self._execute() 34139 1726867644.17550: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.17624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.17628: variable 'omit' from source: magic vars 34139 1726867644.17963: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.17991: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.18149: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.18190: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.18194: when evaluation is False, skipping this task 34139 1726867644.18197: _execute() done 34139 1726867644.18199: dumping result to json 34139 1726867644.18201: done dumping result, returning 34139 1726867644.18205: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c103-b8fd-00000000001a] 34139 1726867644.18210: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001a 34139 1726867644.18269: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.18322: no more pending results, returning what we have 34139 
1726867644.18325: results queue empty 34139 1726867644.18325: checking for any_errors_fatal 34139 1726867644.18333: done checking for any_errors_fatal 34139 1726867644.18333: checking for max_fail_percentage 34139 1726867644.18335: done checking for max_fail_percentage 34139 1726867644.18336: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.18337: done checking to see if all hosts have failed 34139 1726867644.18337: getting the remaining hosts for this loop 34139 1726867644.18338: done getting the remaining hosts for this loop 34139 1726867644.18341: getting the next task for host managed_node1 34139 1726867644.18346: done getting next task for host managed_node1 34139 1726867644.18349: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867644.18352: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.18365: getting variables 34139 1726867644.18367: in VariableManager get_vars() 34139 1726867644.18405: Calling all_inventory to load vars for managed_node1 34139 1726867644.18410: Calling groups_inventory to load vars for managed_node1 34139 1726867644.18412: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.18420: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.18422: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.18425: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.18550: WORKER PROCESS EXITING 34139 1726867644.18560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.18691: done with get_vars() 34139 1726867644.18699: done getting variables 34139 1726867644.18739: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:24 -0400 (0:00:00.019) 0:00:02.933 ****** 34139 1726867644.18759: entering _queue_task() for managed_node1/fail 34139 1726867644.18924: worker is 1 (out of 1 available) 34139 1726867644.18936: exiting _queue_task() for managed_node1/fail 34139 1726867644.18947: done queuing things up, now waiting for results queue to drain 34139 1726867644.18949: waiting for pending results... 
34139 1726867644.19106: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867644.19185: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001b 34139 1726867644.19193: variable 'ansible_search_path' from source: unknown 34139 1726867644.19196: variable 'ansible_search_path' from source: unknown 34139 1726867644.19228: calling self._execute() 34139 1726867644.19292: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.19295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.19305: variable 'omit' from source: magic vars 34139 1726867644.19627: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.19637: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.19716: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.19721: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.19724: when evaluation is False, skipping this task 34139 1726867644.19727: _execute() done 34139 1726867644.19729: dumping result to json 34139 1726867644.19733: done dumping result, returning 34139 1726867644.19744: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c103-b8fd-00000000001b] 34139 1726867644.19747: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001b 34139 1726867644.19823: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001b 34139 1726867644.19826: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.19901: no more pending 
results, returning what we have 34139 1726867644.19904: results queue empty 34139 1726867644.19904: checking for any_errors_fatal 34139 1726867644.19908: done checking for any_errors_fatal 34139 1726867644.19909: checking for max_fail_percentage 34139 1726867644.19910: done checking for max_fail_percentage 34139 1726867644.19911: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.19912: done checking to see if all hosts have failed 34139 1726867644.19913: getting the remaining hosts for this loop 34139 1726867644.19914: done getting the remaining hosts for this loop 34139 1726867644.19916: getting the next task for host managed_node1 34139 1726867644.19921: done getting next task for host managed_node1 34139 1726867644.19924: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867644.19927: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.19938: getting variables 34139 1726867644.19940: in VariableManager get_vars() 34139 1726867644.19976: Calling all_inventory to load vars for managed_node1 34139 1726867644.19982: Calling groups_inventory to load vars for managed_node1 34139 1726867644.19984: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.20022: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.20025: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.20028: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.20249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.20462: done with get_vars() 34139 1726867644.20471: done getting variables 34139 1726867644.20567: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:24 -0400 (0:00:00.018) 0:00:02.952 ****** 34139 1726867644.20595: entering _queue_task() for managed_node1/dnf 34139 1726867644.20804: worker is 1 (out of 1 available) 34139 1726867644.20821: exiting _queue_task() for managed_node1/dnf 34139 1726867644.20831: done queuing things up, now waiting for results queue to drain 34139 1726867644.20833: waiting for pending results... 
34139 1726867644.21109: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867644.21153: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001c 34139 1726867644.21201: variable 'ansible_search_path' from source: unknown 34139 1726867644.21216: variable 'ansible_search_path' from source: unknown 34139 1726867644.21257: calling self._execute() 34139 1726867644.21350: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.21361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.21386: variable 'omit' from source: magic vars 34139 1726867644.21769: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.21790: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.21900: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.21904: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.21907: when evaluation is False, skipping this task 34139 1726867644.21914: _execute() done 34139 1726867644.21925: dumping result to json 34139 1726867644.21928: done dumping result, returning 34139 1726867644.21935: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-00000000001c] 34139 1726867644.21940: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.22075: no more pending results, returning what we have 34139 1726867644.22086: results queue empty 34139 
1726867644.22087: checking for any_errors_fatal 34139 1726867644.22094: done checking for any_errors_fatal 34139 1726867644.22095: checking for max_fail_percentage 34139 1726867644.22096: done checking for max_fail_percentage 34139 1726867644.22097: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.22098: done checking to see if all hosts have failed 34139 1726867644.22098: getting the remaining hosts for this loop 34139 1726867644.22099: done getting the remaining hosts for this loop 34139 1726867644.22103: getting the next task for host managed_node1 34139 1726867644.22108: done getting next task for host managed_node1 34139 1726867644.22111: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867644.22113: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.22126: getting variables 34139 1726867644.22128: in VariableManager get_vars() 34139 1726867644.22169: Calling all_inventory to load vars for managed_node1 34139 1726867644.22171: Calling groups_inventory to load vars for managed_node1 34139 1726867644.22174: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.22184: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.22186: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.22189: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.22303: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001c 34139 1726867644.22307: WORKER PROCESS EXITING 34139 1726867644.22319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.22473: done with get_vars() 34139 1726867644.22482: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34139 1726867644.22532: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:24 -0400 (0:00:00.019) 0:00:02.971 ****** 34139 1726867644.22551: entering _queue_task() for managed_node1/yum 34139 1726867644.22552: Creating lock for yum 34139 1726867644.22740: worker is 1 (out of 1 available) 34139 1726867644.22753: exiting _queue_task() for managed_node1/yum 34139 
1726867644.22764: done queuing things up, now waiting for results queue to drain 34139 1726867644.22766: waiting for pending results... 34139 1726867644.22922: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867644.22995: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001d 34139 1726867644.23007: variable 'ansible_search_path' from source: unknown 34139 1726867644.23010: variable 'ansible_search_path' from source: unknown 34139 1726867644.23041: calling self._execute() 34139 1726867644.23102: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.23105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.23116: variable 'omit' from source: magic vars 34139 1726867644.23380: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.23389: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.23468: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.23472: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.23475: when evaluation is False, skipping this task 34139 1726867644.23480: _execute() done 34139 1726867644.23483: dumping result to json 34139 1726867644.23486: done dumping result, returning 34139 1726867644.23493: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-00000000001d] 34139 1726867644.23498: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
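Every skip in this trace follows the same two-step pattern: the role first confirms the host is not EL6 (`ansible_distribution_major_version != '6'` evaluates True), then checks whether it is EL7 (`== '7'` evaluates False), and skips the task with `"Conditional result was False"`. A minimal sketch of that evaluation follows; the fact value `"40"` is hypothetical, since the log prints only the comparison results, never the fact itself:

```python
# Sketch of the two `when` evaluations the log repeats for every skipped task.
# The fact value "40" is an assumption; any major version other than "6" or "7"
# produces the same True/False pair seen in the log.
facts = {"ansible_distribution_major_version": "40"}

not_el6 = facts["ansible_distribution_major_version"] != "6"  # "Evaluated conditional ... != '6'): True"
is_el7 = facts["ansible_distribution_major_version"] == "7"   # "Evaluated conditional ... == '7'): False"

# When the second conditional is False, the task executor skips the task
# and reports false_condition in the result, as shown in the log above.
print(not_el6, is_el7)
```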
34139 1726867644.23636: no more pending results, returning what we have 34139 1726867644.23638: results queue empty 34139 1726867644.23639: checking for any_errors_fatal 34139 1726867644.23644: done checking for any_errors_fatal 34139 1726867644.23644: checking for max_fail_percentage 34139 1726867644.23646: done checking for max_fail_percentage 34139 1726867644.23646: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.23647: done checking to see if all hosts have failed 34139 1726867644.23648: getting the remaining hosts for this loop 34139 1726867644.23649: done getting the remaining hosts for this loop 34139 1726867644.23652: getting the next task for host managed_node1 34139 1726867644.23657: done getting next task for host managed_node1 34139 1726867644.23660: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867644.23663: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.23675: getting variables 34139 1726867644.23676: in VariableManager get_vars() 34139 1726867644.23726: Calling all_inventory to load vars for managed_node1 34139 1726867644.23729: Calling groups_inventory to load vars for managed_node1 34139 1726867644.23731: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.23739: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.23742: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.23744: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.23957: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001d 34139 1726867644.23961: WORKER PROCESS EXITING 34139 1726867644.23983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.24206: done with get_vars() 34139 1726867644.24219: done getting variables 34139 1726867644.24273: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:24 -0400 (0:00:00.017) 0:00:02.989 ****** 34139 1726867644.24303: entering _queue_task() for managed_node1/fail 34139 1726867644.24524: worker is 1 (out of 1 available) 34139 1726867644.24536: exiting _queue_task() for managed_node1/fail 34139 1726867644.24547: done queuing things up, now waiting for results queue to drain 34139 1726867644.24549: waiting for pending results... 
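The line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` above shows why a task written against the yum module loads the dnf action plugin: in ansible-core 2.17, `ansible.builtin.yum` is a redirect to `ansible.builtin.dnf`. A sketch of a task that would trigger this redirect, with the package name and state chosen for illustration only (the role's actual task body is not shown in the log):

```yaml
# Illustrative only: a yum-module task that ansible-core 2.17 executes
# through the dnf action plugin, matching the redirect seen in the log.
- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:          # redirected to ansible.builtin.dnf at runtime
    name: NetworkManager        # hypothetical package name
    state: latest
  when: ansible_distribution_major_version == '7'
```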
34139 1726867644.24820: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867644.24920: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001e 34139 1726867644.24930: variable 'ansible_search_path' from source: unknown 34139 1726867644.24933: variable 'ansible_search_path' from source: unknown 34139 1726867644.24960: calling self._execute() 34139 1726867644.25022: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.25026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.25035: variable 'omit' from source: magic vars 34139 1726867644.25296: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.25306: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.25384: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.25387: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.25392: when evaluation is False, skipping this task 34139 1726867644.25397: _execute() done 34139 1726867644.25399: dumping result to json 34139 1726867644.25402: done dumping result, returning 34139 1726867644.25408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-00000000001e] 34139 1726867644.25418: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001e 34139 1726867644.25504: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001e 34139 1726867644.25507: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.25556: no more pending results, returning what we have 
34139 1726867644.25559: results queue empty 34139 1726867644.25560: checking for any_errors_fatal 34139 1726867644.25565: done checking for any_errors_fatal 34139 1726867644.25566: checking for max_fail_percentage 34139 1726867644.25567: done checking for max_fail_percentage 34139 1726867644.25568: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.25568: done checking to see if all hosts have failed 34139 1726867644.25569: getting the remaining hosts for this loop 34139 1726867644.25570: done getting the remaining hosts for this loop 34139 1726867644.25573: getting the next task for host managed_node1 34139 1726867644.25580: done getting next task for host managed_node1 34139 1726867644.25583: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34139 1726867644.25585: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.25597: getting variables 34139 1726867644.25598: in VariableManager get_vars() 34139 1726867644.25634: Calling all_inventory to load vars for managed_node1 34139 1726867644.25636: Calling groups_inventory to load vars for managed_node1 34139 1726867644.25639: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.25645: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.25647: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.25649: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.25789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.25915: done with get_vars() 34139 1726867644.25922: done getting variables 34139 1726867644.25959: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:24 -0400 (0:00:00.016) 0:00:03.005 ****** 34139 1726867644.25983: entering _queue_task() for managed_node1/package 34139 1726867644.26155: worker is 1 (out of 1 available) 34139 1726867644.26168: exiting _queue_task() for managed_node1/package 34139 1726867644.26181: done queuing things up, now waiting for results queue to drain 34139 1726867644.26183: waiting for pending results... 
34139 1726867644.26322: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34139 1726867644.26396: in run() - task 0affcac9-a3a5-c103-b8fd-00000000001f 34139 1726867644.26407: variable 'ansible_search_path' from source: unknown 34139 1726867644.26410: variable 'ansible_search_path' from source: unknown 34139 1726867644.26441: calling self._execute() 34139 1726867644.26496: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.26499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.26509: variable 'omit' from source: magic vars 34139 1726867644.26761: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.26769: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.26845: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.26850: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.26854: when evaluation is False, skipping this task 34139 1726867644.26856: _execute() done 34139 1726867644.26859: dumping result to json 34139 1726867644.26861: done dumping result, returning 34139 1726867644.26872: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c103-b8fd-00000000001f] 34139 1726867644.26875: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001f 34139 1726867644.26957: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000001f 34139 1726867644.26959: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.27016: no more pending results, returning what we have 34139 1726867644.27019: results queue empty 34139 1726867644.27019: checking for any_errors_fatal 34139 1726867644.27024: done 
checking for any_errors_fatal 34139 1726867644.27024: checking for max_fail_percentage 34139 1726867644.27026: done checking for max_fail_percentage 34139 1726867644.27026: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.27027: done checking to see if all hosts have failed 34139 1726867644.27028: getting the remaining hosts for this loop 34139 1726867644.27029: done getting the remaining hosts for this loop 34139 1726867644.27031: getting the next task for host managed_node1 34139 1726867644.27036: done getting next task for host managed_node1 34139 1726867644.27039: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867644.27041: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.27053: getting variables 34139 1726867644.27054: in VariableManager get_vars() 34139 1726867644.27088: Calling all_inventory to load vars for managed_node1 34139 1726867644.27089: Calling groups_inventory to load vars for managed_node1 34139 1726867644.27091: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.27097: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.27098: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.27100: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.27208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.27336: done with get_vars() 34139 1726867644.27343: done getting variables 34139 1726867644.27384: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.020 ****** 34139 1726867644.27406: entering _queue_task() for managed_node1/package 34139 1726867644.27572: worker is 1 (out of 1 available) 34139 1726867644.27586: exiting _queue_task() for managed_node1/package 34139 1726867644.27599: done queuing things up, now waiting for results queue to drain 34139 1726867644.27600: waiting for pending results... 
34139 1726867644.27741: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867644.27815: in run() - task 0affcac9-a3a5-c103-b8fd-000000000020 34139 1726867644.27835: variable 'ansible_search_path' from source: unknown 34139 1726867644.27838: variable 'ansible_search_path' from source: unknown 34139 1726867644.27858: calling self._execute() 34139 1726867644.27910: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.27918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.27926: variable 'omit' from source: magic vars 34139 1726867644.28217: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.28225: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.28302: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.28306: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.28308: when evaluation is False, skipping this task 34139 1726867644.28315: _execute() done 34139 1726867644.28318: dumping result to json 34139 1726867644.28320: done dumping result, returning 34139 1726867644.28327: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-000000000020] 34139 1726867644.28331: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000020 34139 1726867644.28415: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000020 34139 1726867644.28417: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.28459: no more pending results, returning what we have 34139 1726867644.28461: 
results queue empty 34139 1726867644.28462: checking for any_errors_fatal 34139 1726867644.28466: done checking for any_errors_fatal 34139 1726867644.28467: checking for max_fail_percentage 34139 1726867644.28468: done checking for max_fail_percentage 34139 1726867644.28469: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.28469: done checking to see if all hosts have failed 34139 1726867644.28470: getting the remaining hosts for this loop 34139 1726867644.28471: done getting the remaining hosts for this loop 34139 1726867644.28474: getting the next task for host managed_node1 34139 1726867644.28480: done getting next task for host managed_node1 34139 1726867644.28483: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867644.28486: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.28497: getting variables 34139 1726867644.28498: in VariableManager get_vars() 34139 1726867644.28533: Calling all_inventory to load vars for managed_node1 34139 1726867644.28535: Calling groups_inventory to load vars for managed_node1 34139 1726867644.28537: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.28543: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.28544: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.28546: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.28686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.28811: done with get_vars() 34139 1726867644.28818: done getting variables 34139 1726867644.28855: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.034 ****** 34139 1726867644.28876: entering _queue_task() for managed_node1/package 34139 1726867644.29040: worker is 1 (out of 1 available) 34139 1726867644.29056: exiting _queue_task() for managed_node1/package 34139 1726867644.29066: done queuing things up, now waiting for results queue to drain 34139 1726867644.29068: waiting for pending results... 
34139 1726867644.29227: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867644.29310: in run() - task 0affcac9-a3a5-c103-b8fd-000000000021 34139 1726867644.29321: variable 'ansible_search_path' from source: unknown 34139 1726867644.29325: variable 'ansible_search_path' from source: unknown 34139 1726867644.29350: calling self._execute() 34139 1726867644.29410: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.29414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.29425: variable 'omit' from source: magic vars 34139 1726867644.29684: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.29694: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.29771: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.29775: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.29779: when evaluation is False, skipping this task 34139 1726867644.29784: _execute() done 34139 1726867644.29787: dumping result to json 34139 1726867644.29789: done dumping result, returning 34139 1726867644.29797: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-000000000021] 34139 1726867644.29801: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000021 34139 1726867644.29884: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000021 34139 1726867644.29887: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.29931: no more pending results, returning what we have 34139 1726867644.29934: results queue 
empty 34139 1726867644.29935: checking for any_errors_fatal 34139 1726867644.29940: done checking for any_errors_fatal 34139 1726867644.29941: checking for max_fail_percentage 34139 1726867644.29942: done checking for max_fail_percentage 34139 1726867644.29943: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.29943: done checking to see if all hosts have failed 34139 1726867644.29944: getting the remaining hosts for this loop 34139 1726867644.29945: done getting the remaining hosts for this loop 34139 1726867644.29949: getting the next task for host managed_node1 34139 1726867644.29954: done getting next task for host managed_node1 34139 1726867644.29957: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867644.29959: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.29971: getting variables 34139 1726867644.29973: in VariableManager get_vars() 34139 1726867644.30018: Calling all_inventory to load vars for managed_node1 34139 1726867644.30021: Calling groups_inventory to load vars for managed_node1 34139 1726867644.30023: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.30029: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.30031: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.30033: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.30146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.30275: done with get_vars() 34139 1726867644.30283: done getting variables 34139 1726867644.30351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.049 ****** 34139 1726867644.30371: entering _queue_task() for managed_node1/service 34139 1726867644.30373: Creating lock for service 34139 1726867644.30544: worker is 1 (out of 1 available) 34139 1726867644.30556: exiting _queue_task() for managed_node1/service 34139 1726867644.30567: done queuing things up, now waiting for results queue to drain 34139 1726867644.30568: waiting for pending results... 
34139 1726867644.30712: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867644.30781: in run() - task 0affcac9-a3a5-c103-b8fd-000000000022 34139 1726867644.30797: variable 'ansible_search_path' from source: unknown 34139 1726867644.30801: variable 'ansible_search_path' from source: unknown 34139 1726867644.30823: calling self._execute() 34139 1726867644.30932: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.30935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.30944: variable 'omit' from source: magic vars 34139 1726867644.31170: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.31180: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.31256: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.31260: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.31263: when evaluation is False, skipping this task 34139 1726867644.31267: _execute() done 34139 1726867644.31270: dumping result to json 34139 1726867644.31272: done dumping result, returning 34139 1726867644.31281: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000022] 34139 1726867644.31286: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000022 34139 1726867644.31368: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000022 34139 1726867644.31371: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.31470: no more pending results, returning what we have 34139 1726867644.31472: results queue empty 
34139 1726867644.31473: checking for any_errors_fatal 34139 1726867644.31481: done checking for any_errors_fatal 34139 1726867644.31482: checking for max_fail_percentage 34139 1726867644.31483: done checking for max_fail_percentage 34139 1726867644.31484: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.31484: done checking to see if all hosts have failed 34139 1726867644.31484: getting the remaining hosts for this loop 34139 1726867644.31485: done getting the remaining hosts for this loop 34139 1726867644.31487: getting the next task for host managed_node1 34139 1726867644.31491: done getting next task for host managed_node1 34139 1726867644.31493: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867644.31494: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.31502: getting variables 34139 1726867644.31503: in VariableManager get_vars() 34139 1726867644.31527: Calling all_inventory to load vars for managed_node1 34139 1726867644.31529: Calling groups_inventory to load vars for managed_node1 34139 1726867644.31530: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.31535: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.31537: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.31539: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.31640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.31765: done with get_vars() 34139 1726867644.31771: done getting variables 34139 1726867644.31813: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.064 ****** 34139 1726867644.31832: entering _queue_task() for managed_node1/service 34139 1726867644.31992: worker is 1 (out of 1 available) 34139 1726867644.32003: exiting _queue_task() for managed_node1/service 34139 1726867644.32016: done queuing things up, now waiting for results queue to drain 34139 1726867644.32018: waiting for pending results... 
34139 1726867644.32153: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867644.32221: in run() - task 0affcac9-a3a5-c103-b8fd-000000000023 34139 1726867644.32231: variable 'ansible_search_path' from source: unknown 34139 1726867644.32234: variable 'ansible_search_path' from source: unknown 34139 1726867644.32261: calling self._execute() 34139 1726867644.32314: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.32317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.32323: variable 'omit' from source: magic vars 34139 1726867644.32552: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.32560: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.32636: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.32640: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.32642: when evaluation is False, skipping this task 34139 1726867644.32645: _execute() done 34139 1726867644.32648: dumping result to json 34139 1726867644.32650: done dumping result, returning 34139 1726867644.32657: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c103-b8fd-000000000023] 34139 1726867644.32661: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000023 34139 1726867644.32746: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000023 34139 1726867644.32749: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34139 1726867644.32811: no more pending results, returning what we have 34139 1726867644.32813: results queue empty 34139 1726867644.32814: checking for any_errors_fatal 
34139 1726867644.32818: done checking for any_errors_fatal 34139 1726867644.32819: checking for max_fail_percentage 34139 1726867644.32820: done checking for max_fail_percentage 34139 1726867644.32821: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.32821: done checking to see if all hosts have failed 34139 1726867644.32822: getting the remaining hosts for this loop 34139 1726867644.32823: done getting the remaining hosts for this loop 34139 1726867644.32826: getting the next task for host managed_node1 34139 1726867644.32830: done getting next task for host managed_node1 34139 1726867644.32833: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34139 1726867644.32836: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.32844: getting variables 34139 1726867644.32845: in VariableManager get_vars() 34139 1726867644.32871: Calling all_inventory to load vars for managed_node1 34139 1726867644.32873: Calling groups_inventory to load vars for managed_node1 34139 1726867644.32874: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.32882: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.32883: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.32885: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.33018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.33144: done with get_vars() 34139 1726867644.33150: done getting variables 34139 1726867644.33190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:24 -0400 (0:00:00.013) 0:00:03.078 ****** 34139 1726867644.33210: entering _queue_task() for managed_node1/service 34139 1726867644.33371: worker is 1 (out of 1 available) 34139 1726867644.33386: exiting _queue_task() for managed_node1/service 34139 1726867644.33396: done queuing things up, now waiting for results queue to drain 34139 1726867644.33398: waiting for pending results... 
34139 1726867644.33534: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34139 1726867644.33601: in run() - task 0affcac9-a3a5-c103-b8fd-000000000024 34139 1726867644.33614: variable 'ansible_search_path' from source: unknown 34139 1726867644.33617: variable 'ansible_search_path' from source: unknown 34139 1726867644.33643: calling self._execute() 34139 1726867644.33694: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.33697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.33706: variable 'omit' from source: magic vars 34139 1726867644.33933: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.33942: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.34018: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.34021: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.34024: when evaluation is False, skipping this task 34139 1726867644.34028: _execute() done 34139 1726867644.34031: dumping result to json 34139 1726867644.34035: done dumping result, returning 34139 1726867644.34041: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c103-b8fd-000000000024] 34139 1726867644.34046: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000024 34139 1726867644.34130: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000024 34139 1726867644.34133: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.34196: no more pending results, returning what we have 34139 1726867644.34198: results queue empty 34139 1726867644.34199: checking for any_errors_fatal 
34139 1726867644.34204: done checking for any_errors_fatal 34139 1726867644.34204: checking for max_fail_percentage 34139 1726867644.34206: done checking for max_fail_percentage 34139 1726867644.34206: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.34209: done checking to see if all hosts have failed 34139 1726867644.34210: getting the remaining hosts for this loop 34139 1726867644.34211: done getting the remaining hosts for this loop 34139 1726867644.34214: getting the next task for host managed_node1 34139 1726867644.34219: done getting next task for host managed_node1 34139 1726867644.34222: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34139 1726867644.34224: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.34232: getting variables 34139 1726867644.34233: in VariableManager get_vars() 34139 1726867644.34259: Calling all_inventory to load vars for managed_node1 34139 1726867644.34261: Calling groups_inventory to load vars for managed_node1 34139 1726867644.34262: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.34267: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.34269: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.34270: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.34382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.34510: done with get_vars() 34139 1726867644.34517: done getting variables 34139 1726867644.34555: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:24 -0400 (0:00:00.013) 0:00:03.091 ****** 34139 1726867644.34573: entering _queue_task() for managed_node1/service 34139 1726867644.34735: worker is 1 (out of 1 available) 34139 1726867644.34748: exiting _queue_task() for managed_node1/service 34139 1726867644.34758: done queuing things up, now waiting for results queue to drain 34139 1726867644.34759: waiting for pending results... 
34139 1726867644.34892: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34139 1726867644.34962: in run() - task 0affcac9-a3a5-c103-b8fd-000000000025 34139 1726867644.34971: variable 'ansible_search_path' from source: unknown 34139 1726867644.34974: variable 'ansible_search_path' from source: unknown 34139 1726867644.35004: calling self._execute() 34139 1726867644.35053: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.35056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.35065: variable 'omit' from source: magic vars 34139 1726867644.35294: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.35305: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.35379: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.35384: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.35386: when evaluation is False, skipping this task 34139 1726867644.35389: _execute() done 34139 1726867644.35392: dumping result to json 34139 1726867644.35396: done dumping result, returning 34139 1726867644.35402: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c103-b8fd-000000000025] 34139 1726867644.35410: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000025 34139 1726867644.35491: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000025 34139 1726867644.35493: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34139 1726867644.35559: no more pending results, returning what we have 34139 1726867644.35562: results queue empty 34139 1726867644.35562: checking for any_errors_fatal 34139 
1726867644.35567: done checking for any_errors_fatal 34139 1726867644.35567: checking for max_fail_percentage 34139 1726867644.35569: done checking for max_fail_percentage 34139 1726867644.35569: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.35570: done checking to see if all hosts have failed 34139 1726867644.35571: getting the remaining hosts for this loop 34139 1726867644.35572: done getting the remaining hosts for this loop 34139 1726867644.35574: getting the next task for host managed_node1 34139 1726867644.35582: done getting next task for host managed_node1 34139 1726867644.35584: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34139 1726867644.35586: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.35595: getting variables 34139 1726867644.35596: in VariableManager get_vars() 34139 1726867644.35624: Calling all_inventory to load vars for managed_node1 34139 1726867644.35626: Calling groups_inventory to load vars for managed_node1 34139 1726867644.35628: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.35633: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.35634: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.35638: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.35773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.35906: done with get_vars() 34139 1726867644.35914: done getting variables 34139 1726867644.35951: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:24 -0400 (0:00:00.013) 0:00:03.105 ****** 34139 1726867644.35972: entering _queue_task() for managed_node1/copy 34139 1726867644.36132: worker is 1 (out of 1 available) 34139 1726867644.36144: exiting _queue_task() for managed_node1/copy 34139 1726867644.36154: done queuing things up, now waiting for results queue to drain 34139 1726867644.36156: waiting for pending results... 
34139 1726867644.36313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34139 1726867644.36389: in run() - task 0affcac9-a3a5-c103-b8fd-000000000026 34139 1726867644.36402: variable 'ansible_search_path' from source: unknown 34139 1726867644.36406: variable 'ansible_search_path' from source: unknown 34139 1726867644.36431: calling self._execute() 34139 1726867644.36486: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.36492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.36501: variable 'omit' from source: magic vars 34139 1726867644.36758: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.36767: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.36849: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.36853: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.36856: when evaluation is False, skipping this task 34139 1726867644.36859: _execute() done 34139 1726867644.36862: dumping result to json 34139 1726867644.36865: done dumping result, returning 34139 1726867644.36874: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c103-b8fd-000000000026] 34139 1726867644.36878: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000026 34139 1726867644.36964: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000026 34139 1726867644.36967: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.37012: no more pending results, returning what we have 34139 1726867644.37014: results queue empty 34139 
1726867644.37015: checking for any_errors_fatal 34139 1726867644.37019: done checking for any_errors_fatal 34139 1726867644.37020: checking for max_fail_percentage 34139 1726867644.37021: done checking for max_fail_percentage 34139 1726867644.37022: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.37023: done checking to see if all hosts have failed 34139 1726867644.37023: getting the remaining hosts for this loop 34139 1726867644.37025: done getting the remaining hosts for this loop 34139 1726867644.37027: getting the next task for host managed_node1 34139 1726867644.37033: done getting next task for host managed_node1 34139 1726867644.37035: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867644.37038: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.37050: getting variables 34139 1726867644.37051: in VariableManager get_vars() 34139 1726867644.37090: Calling all_inventory to load vars for managed_node1 34139 1726867644.37093: Calling groups_inventory to load vars for managed_node1 34139 1726867644.37095: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.37102: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.37104: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.37106: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.37216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.37343: done with get_vars() 34139 1726867644.37350: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.120 ****** 34139 1726867644.37407: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867644.37408: Creating lock for fedora.linux_system_roles.network_connections 34139 1726867644.37580: worker is 1 (out of 1 available) 34139 1726867644.37592: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867644.37603: done queuing things up, now waiting for results queue to drain 34139 1726867644.37605: waiting for pending results... 
34139 1726867644.37752: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867644.37822: in run() - task 0affcac9-a3a5-c103-b8fd-000000000027 34139 1726867644.37836: variable 'ansible_search_path' from source: unknown 34139 1726867644.37843: variable 'ansible_search_path' from source: unknown 34139 1726867644.37869: calling self._execute() 34139 1726867644.37928: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.37931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.37943: variable 'omit' from source: magic vars 34139 1726867644.38241: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.38250: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.38330: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.38334: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.38336: when evaluation is False, skipping this task 34139 1726867644.38339: _execute() done 34139 1726867644.38341: dumping result to json 34139 1726867644.38346: done dumping result, returning 34139 1726867644.38353: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c103-b8fd-000000000027] 34139 1726867644.38358: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000027 34139 1726867644.38444: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000027 34139 1726867644.38447: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.38520: no more pending results, returning what we have 34139 1726867644.38523: results queue empty 34139 1726867644.38524: checking 
for any_errors_fatal 34139 1726867644.38528: done checking for any_errors_fatal 34139 1726867644.38529: checking for max_fail_percentage 34139 1726867644.38530: done checking for max_fail_percentage 34139 1726867644.38531: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.38531: done checking to see if all hosts have failed 34139 1726867644.38532: getting the remaining hosts for this loop 34139 1726867644.38533: done getting the remaining hosts for this loop 34139 1726867644.38536: getting the next task for host managed_node1 34139 1726867644.38540: done getting next task for host managed_node1 34139 1726867644.38543: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867644.38545: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.38554: getting variables 34139 1726867644.38555: in VariableManager get_vars() 34139 1726867644.38583: Calling all_inventory to load vars for managed_node1 34139 1726867644.38585: Calling groups_inventory to load vars for managed_node1 34139 1726867644.38588: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.38594: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.38596: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.38597: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.38733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.38858: done with get_vars() 34139 1726867644.38865: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:24 -0400 (0:00:00.015) 0:00:03.135 ****** 34139 1726867644.38921: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867644.38922: Creating lock for fedora.linux_system_roles.network_state 34139 1726867644.39093: worker is 1 (out of 1 available) 34139 1726867644.39105: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867644.39118: done queuing things up, now waiting for results queue to drain 34139 1726867644.39119: waiting for pending results... 
34139 1726867644.39254: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867644.39324: in run() - task 0affcac9-a3a5-c103-b8fd-000000000028 34139 1726867644.39335: variable 'ansible_search_path' from source: unknown 34139 1726867644.39339: variable 'ansible_search_path' from source: unknown 34139 1726867644.39365: calling self._execute() 34139 1726867644.39414: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.39417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.39426: variable 'omit' from source: magic vars 34139 1726867644.39655: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.39664: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.39740: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.39743: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.39746: when evaluation is False, skipping this task 34139 1726867644.39749: _execute() done 34139 1726867644.39752: dumping result to json 34139 1726867644.39757: done dumping result, returning 34139 1726867644.39763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c103-b8fd-000000000028] 34139 1726867644.39767: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000028 34139 1726867644.39847: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000028 34139 1726867644.39850: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.39925: no more pending results, returning what we have 34139 1726867644.39928: results queue empty 34139 1726867644.39929: checking for any_errors_fatal 34139 
1726867644.39933: done checking for any_errors_fatal 34139 1726867644.39933: checking for max_fail_percentage 34139 1726867644.39935: done checking for max_fail_percentage 34139 1726867644.39935: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.39936: done checking to see if all hosts have failed 34139 1726867644.39936: getting the remaining hosts for this loop 34139 1726867644.39937: done getting the remaining hosts for this loop 34139 1726867644.39939: getting the next task for host managed_node1 34139 1726867644.39943: done getting next task for host managed_node1 34139 1726867644.39945: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34139 1726867644.39947: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.39963: getting variables 34139 1726867644.39964: in VariableManager get_vars() 34139 1726867644.40006: Calling all_inventory to load vars for managed_node1 34139 1726867644.40011: Calling groups_inventory to load vars for managed_node1 34139 1726867644.40013: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.40019: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.40020: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.40022: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.40130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.40256: done with get_vars() 34139 1726867644.40263: done getting variables 34139 1726867644.40301: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.149 ****** 34139 1726867644.40323: entering _queue_task() for managed_node1/debug 34139 1726867644.40483: worker is 1 (out of 1 available) 34139 1726867644.40495: exiting _queue_task() for managed_node1/debug 34139 1726867644.40505: done queuing things up, now waiting for results queue to drain 34139 1726867644.40507: waiting for pending results... 
34139 1726867644.40644: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34139 1726867644.40704: in run() - task 0affcac9-a3a5-c103-b8fd-000000000029
34139 1726867644.40715: variable 'ansible_search_path' from source: unknown
34139 1726867644.40719: variable 'ansible_search_path' from source: unknown
34139 1726867644.40745: calling self._execute()
34139 1726867644.40794: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.40797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.40806: variable 'omit' from source: magic vars
34139 1726867644.41079: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.41088: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.41160: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.41164: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.41169: when evaluation is False, skipping this task
34139 1726867644.41171: _execute() done
34139 1726867644.41174: dumping result to json
34139 1726867644.41176: done dumping result, returning
34139 1726867644.41191: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c103-b8fd-000000000029]
34139 1726867644.41193: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000029
34139 1726867644.41271: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000029
34139 1726867644.41274: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867644.41333: no more pending results, returning what we have
34139 1726867644.41335: results queue empty
34139 1726867644.41336: checking for any_errors_fatal
34139 1726867644.41340: done checking for any_errors_fatal
34139 1726867644.41340: checking for max_fail_percentage
34139 1726867644.41342: done checking for max_fail_percentage
34139 1726867644.41342: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.41343: done checking to see if all hosts have failed
34139 1726867644.41344: getting the remaining hosts for this loop
34139 1726867644.41345: done getting the remaining hosts for this loop
34139 1726867644.41348: getting the next task for host managed_node1
34139 1726867644.41352: done getting next task for host managed_node1
34139 1726867644.41355: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34139 1726867644.41357: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.41368: getting variables
34139 1726867644.41370: in VariableManager get_vars()
34139 1726867644.41463: Calling all_inventory to load vars for managed_node1
34139 1726867644.41466: Calling groups_inventory to load vars for managed_node1
34139 1726867644.41469: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.41480: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.41483: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.41486: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.41650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.41881: done with get_vars()
34139 1726867644.41892: done getting variables
34139 1726867644.41952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 17:27:24 -0400 (0:00:00.016) 0:00:03.165 ******
34139 1726867644.41980: entering _queue_task() for managed_node1/debug
34139 1726867644.42249: worker is 1 (out of 1 available)
34139 1726867644.42261: exiting _queue_task() for managed_node1/debug
34139 1726867644.42272: done queuing things up, now waiting for results queue to drain
34139 1726867644.42273: waiting for pending results...
34139 1726867644.42510: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34139 1726867644.42590: in run() - task 0affcac9-a3a5-c103-b8fd-00000000002a
34139 1726867644.42601: variable 'ansible_search_path' from source: unknown
34139 1726867644.42605: variable 'ansible_search_path' from source: unknown
34139 1726867644.42634: calling self._execute()
34139 1726867644.42692: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.42696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.42704: variable 'omit' from source: magic vars
34139 1726867644.42961: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.42970: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.43049: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.43053: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.43058: when evaluation is False, skipping this task
34139 1726867644.43061: _execute() done
34139 1726867644.43063: dumping result to json
34139 1726867644.43065: done dumping result, returning
34139 1726867644.43073: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c103-b8fd-00000000002a]
34139 1726867644.43079: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002a
34139 1726867644.43159: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002a
34139 1726867644.43162: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867644.43217: no more pending results, returning what we have
34139 1726867644.43220: results queue empty
34139 1726867644.43221: checking for any_errors_fatal
34139 1726867644.43225: done checking for any_errors_fatal
34139 1726867644.43226: checking for max_fail_percentage
34139 1726867644.43227: done checking for max_fail_percentage
34139 1726867644.43228: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.43229: done checking to see if all hosts have failed
34139 1726867644.43229: getting the remaining hosts for this loop
34139 1726867644.43230: done getting the remaining hosts for this loop
34139 1726867644.43233: getting the next task for host managed_node1
34139 1726867644.43238: done getting next task for host managed_node1
34139 1726867644.43241: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34139 1726867644.43243: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.43255: getting variables
34139 1726867644.43257: in VariableManager get_vars()
34139 1726867644.43295: Calling all_inventory to load vars for managed_node1
34139 1726867644.43297: Calling groups_inventory to load vars for managed_node1
34139 1726867644.43298: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.43304: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.43305: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.43307: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.43416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.43692: done with get_vars()
34139 1726867644.43698: done getting variables
34139 1726867644.43737: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 17:27:24 -0400 (0:00:00.017) 0:00:03.183 ******
34139 1726867644.43756: entering _queue_task() for managed_node1/debug
34139 1726867644.43915: worker is 1 (out of 1 available)
34139 1726867644.43927: exiting _queue_task() for managed_node1/debug
34139 1726867644.43938: done queuing things up, now waiting for results queue to drain
34139 1726867644.43939: waiting for pending results...
34139 1726867644.44093: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34139 1726867644.44172: in run() - task 0affcac9-a3a5-c103-b8fd-00000000002b
34139 1726867644.44184: variable 'ansible_search_path' from source: unknown
34139 1726867644.44187: variable 'ansible_search_path' from source: unknown
34139 1726867644.44212: calling self._execute()
34139 1726867644.44483: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.44486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.44489: variable 'omit' from source: magic vars
34139 1726867644.44641: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.44658: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.44770: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.44785: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.44794: when evaluation is False, skipping this task
34139 1726867644.44802: _execute() done
34139 1726867644.44812: dumping result to json
34139 1726867644.44821: done dumping result, returning
34139 1726867644.44832: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c103-b8fd-00000000002b]
34139 1726867644.44841: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002b
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867644.44988: no more pending results, returning what we have
34139 1726867644.44991: results queue empty
34139 1726867644.44991: checking for any_errors_fatal
34139 1726867644.45000: done checking for any_errors_fatal
34139 1726867644.45000: checking for max_fail_percentage
34139 1726867644.45003: done checking for max_fail_percentage
34139 1726867644.45004: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.45005: done checking to see if all hosts have failed
34139 1726867644.45005: getting the remaining hosts for this loop
34139 1726867644.45006: done getting the remaining hosts for this loop
34139 1726867644.45010: getting the next task for host managed_node1
34139 1726867644.45016: done getting next task for host managed_node1
34139 1726867644.45020: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
34139 1726867644.45023: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.45041: getting variables
34139 1726867644.45043: in VariableManager get_vars()
34139 1726867644.45086: Calling all_inventory to load vars for managed_node1
34139 1726867644.45089: Calling groups_inventory to load vars for managed_node1
34139 1726867644.45091: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.45101: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.45103: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.45184: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.45484: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002b
34139 1726867644.45488: WORKER PROCESS EXITING
34139 1726867644.45513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.45752: done with get_vars()
34139 1726867644.45761: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 17:27:24 -0400 (0:00:00.020) 0:00:03.204 ******
34139 1726867644.45852: entering _queue_task() for managed_node1/ping
34139 1726867644.45854: Creating lock for ping
34139 1726867644.46166: worker is 1 (out of 1 available)
34139 1726867644.46180: exiting _queue_task() for managed_node1/ping
34139 1726867644.46191: done queuing things up, now waiting for results queue to drain
34139 1726867644.46193: waiting for pending results...
34139 1726867644.46385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
34139 1726867644.46516: in run() - task 0affcac9-a3a5-c103-b8fd-00000000002c
34139 1726867644.46539: variable 'ansible_search_path' from source: unknown
34139 1726867644.46547: variable 'ansible_search_path' from source: unknown
34139 1726867644.46590: calling self._execute()
34139 1726867644.46679: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.46691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.46704: variable 'omit' from source: magic vars
34139 1726867644.47078: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.47098: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.47223: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.47233: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.47239: when evaluation is False, skipping this task
34139 1726867644.47246: _execute() done
34139 1726867644.47251: dumping result to json
34139 1726867644.47259: done dumping result, returning
34139 1726867644.47269: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c103-b8fd-00000000002c]
34139 1726867644.47279: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002c
34139 1726867644.47483: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000002c
34139 1726867644.47487: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.47529: no more pending results, returning what we have
34139 1726867644.47532: results queue empty
34139 1726867644.47532: checking for any_errors_fatal
34139 1726867644.47544: done checking for any_errors_fatal
34139 1726867644.47545: checking for max_fail_percentage
34139 1726867644.47546: done checking for max_fail_percentage
34139 1726867644.47547: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.47548: done checking to see if all hosts have failed
34139 1726867644.47548: getting the remaining hosts for this loop
34139 1726867644.47550: done getting the remaining hosts for this loop
34139 1726867644.47554: getting the next task for host managed_node1
34139 1726867644.47563: done getting next task for host managed_node1
34139 1726867644.47564: ^ task is: TASK: meta (role_complete)
34139 1726867644.47567: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.47656: getting variables
34139 1726867644.47658: in VariableManager get_vars()
34139 1726867644.47698: Calling all_inventory to load vars for managed_node1
34139 1726867644.47700: Calling groups_inventory to load vars for managed_node1
34139 1726867644.47702: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.47710: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.47713: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.47716: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.47967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.48198: done with get_vars()
34139 1726867644.48208: done getting variables
34139 1726867644.48281: done queuing things up, now waiting for results queue to drain
34139 1726867644.48283: results queue empty
34139 1726867644.48284: checking for any_errors_fatal
34139 1726867644.48286: done checking for any_errors_fatal
34139 1726867644.48287: checking for max_fail_percentage
34139 1726867644.48288: done checking for max_fail_percentage
34139 1726867644.48288: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.48289: done checking to see if all hosts have failed
34139 1726867644.48290: getting the remaining hosts for this loop
34139 1726867644.48290: done getting the remaining hosts for this loop
34139 1726867644.48293: getting the next task for host managed_node1
34139 1726867644.48305: done getting next task for host managed_node1
34139 1726867644.48308: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34139 1726867644.48309: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.48318: getting variables
34139 1726867644.48319: in VariableManager get_vars()
34139 1726867644.48337: Calling all_inventory to load vars for managed_node1
34139 1726867644.48339: Calling groups_inventory to load vars for managed_node1
34139 1726867644.48341: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.48345: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.48347: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.48350: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.48501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.48726: done with get_vars()
34139 1726867644.48738: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:27:24 -0400 (0:00:00.029) 0:00:03.234 ******
34139 1726867644.48804: entering _queue_task() for managed_node1/include_tasks
34139 1726867644.49037: worker is 1 (out of 1 available)
34139 1726867644.49049: exiting _queue_task() for managed_node1/include_tasks
34139 1726867644.49173: done queuing things up, now waiting for results queue to drain
34139 1726867644.49175: waiting for pending results...
34139 1726867644.49330: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34139 1726867644.49485: in run() - task 0affcac9-a3a5-c103-b8fd-000000000063
34139 1726867644.49489: variable 'ansible_search_path' from source: unknown
34139 1726867644.49492: variable 'ansible_search_path' from source: unknown
34139 1726867644.49528: calling self._execute()
34139 1726867644.49617: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.49682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.49686: variable 'omit' from source: magic vars
34139 1726867644.50028: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.50057: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.50181: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.50192: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.50199: when evaluation is False, skipping this task
34139 1726867644.50206: _execute() done
34139 1726867644.50212: dumping result to json
34139 1726867644.50219: done dumping result, returning
34139 1726867644.50228: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c103-b8fd-000000000063]
34139 1726867644.50237: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000063
34139 1726867644.50445: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000063
34139 1726867644.50448: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.50528: no more pending results, returning what we have
34139 1726867644.50532: results queue empty
34139 1726867644.50533: checking for any_errors_fatal
34139 1726867644.50535: done checking for any_errors_fatal
34139 1726867644.50535: checking for max_fail_percentage
34139 1726867644.50537: done checking for max_fail_percentage
34139 1726867644.50538: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.50539: done checking to see if all hosts have failed
34139 1726867644.50539: getting the remaining hosts for this loop
34139 1726867644.50541: done getting the remaining hosts for this loop
34139 1726867644.50545: getting the next task for host managed_node1
34139 1726867644.50551: done getting next task for host managed_node1
34139 1726867644.50555: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34139 1726867644.50558: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.50575: getting variables
34139 1726867644.50680: in VariableManager get_vars()
34139 1726867644.50721: Calling all_inventory to load vars for managed_node1
34139 1726867644.50724: Calling groups_inventory to load vars for managed_node1
34139 1726867644.50726: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.50733: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.50736: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.50739: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.50971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.51205: done with get_vars()
34139 1726867644.51218: done getting variables
34139 1726867644.51271: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 17:27:24 -0400 (0:00:00.024) 0:00:03.259 ******
34139 1726867644.51303: entering _queue_task() for managed_node1/debug
34139 1726867644.51661: worker is 1 (out of 1 available)
34139 1726867644.51671: exiting _queue_task() for managed_node1/debug
34139 1726867644.51681: done queuing things up, now waiting for results queue to drain
34139 1726867644.51683: waiting for pending results...
34139 1726867644.51854: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
34139 1726867644.52016: in run() - task 0affcac9-a3a5-c103-b8fd-000000000064
34139 1726867644.52020: variable 'ansible_search_path' from source: unknown
34139 1726867644.52022: variable 'ansible_search_path' from source: unknown
34139 1726867644.52048: calling self._execute()
34139 1726867644.52137: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.52183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.52186: variable 'omit' from source: magic vars
34139 1726867644.52542: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.52561: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.52814: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.52826: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.52832: when evaluation is False, skipping this task
34139 1726867644.52860: _execute() done
34139 1726867644.52863: dumping result to json
34139 1726867644.52866: done dumping result, returning
34139 1726867644.52869: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c103-b8fd-000000000064]
34139 1726867644.52970: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000064
34139 1726867644.53035: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000064
34139 1726867644.53038: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867644.53119: no more pending results, returning what we have
34139 1726867644.53123: results queue empty
34139 1726867644.53123: checking for any_errors_fatal
34139 1726867644.53130: done checking for any_errors_fatal
34139 1726867644.53131: checking for max_fail_percentage
34139 1726867644.53132: done checking for max_fail_percentage
34139 1726867644.53133: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.53134: done checking to see if all hosts have failed
34139 1726867644.53135: getting the remaining hosts for this loop
34139 1726867644.53136: done getting the remaining hosts for this loop
34139 1726867644.53140: getting the next task for host managed_node1
34139 1726867644.53147: done getting next task for host managed_node1
34139 1726867644.53151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34139 1726867644.53154: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.53172: getting variables
34139 1726867644.53174: in VariableManager get_vars()
34139 1726867644.53223: Calling all_inventory to load vars for managed_node1
34139 1726867644.53226: Calling groups_inventory to load vars for managed_node1
34139 1726867644.53228: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.53240: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.53244: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.53247: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.53528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.53765: done with get_vars()
34139 1726867644.53773: done getting variables
34139 1726867644.53823: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 17:27:24 -0400 (0:00:00.025) 0:00:03.284 ******
34139 1726867644.53849: entering _queue_task() for managed_node1/fail
34139 1726867644.54059: worker is 1 (out of 1 available)
34139 1726867644.54072: exiting _queue_task() for managed_node1/fail
34139 1726867644.54085: done queuing things up, now waiting for results queue to drain
34139 1726867644.54087: waiting for pending results...
34139 1726867644.54559: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34139 1726867644.55154: in run() - task 0affcac9-a3a5-c103-b8fd-000000000065
34139 1726867644.55158: variable 'ansible_search_path' from source: unknown
34139 1726867644.55161: variable 'ansible_search_path' from source: unknown
34139 1726867644.55164: calling self._execute()
34139 1726867644.55166: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.55168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.55171: variable 'omit' from source: magic vars
34139 1726867644.56173: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.56186: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.56417: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.56426: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.56433: when evaluation is False, skipping this task
34139 1726867644.56439: _execute() done
34139 1726867644.56444: dumping result to json
34139 1726867644.56451: done dumping result, returning
34139 1726867644.56461: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c103-b8fd-000000000065]
34139 1726867644.56684: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000065
34139 1726867644.56754: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000065
34139 1726867644.56757: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.56805: no more pending results, returning what we have 34139 1726867644.56808: results queue empty 34139 1726867644.56809: checking for any_errors_fatal 34139 1726867644.56813: done checking for any_errors_fatal 34139 1726867644.56814: checking for max_fail_percentage 34139 1726867644.56816: done checking for max_fail_percentage 34139 1726867644.56817: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.56817: done checking to see if all hosts have failed 34139 1726867644.56818: getting the remaining hosts for this loop 34139 1726867644.56820: done getting the remaining hosts for this loop 34139 1726867644.56824: getting the next task for host managed_node1 34139 1726867644.56832: done getting next task for host managed_node1 34139 1726867644.56835: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867644.56839: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.56857: getting variables 34139 1726867644.56859: in VariableManager get_vars() 34139 1726867644.56909: Calling all_inventory to load vars for managed_node1 34139 1726867644.56912: Calling groups_inventory to load vars for managed_node1 34139 1726867644.56914: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.56926: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.56930: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.56934: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.57619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.57913: done with get_vars() 34139 1726867644.57925: done getting variables 34139 1726867644.57983: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:24 -0400 (0:00:00.041) 0:00:03.326 ****** 34139 1726867644.58014: entering _queue_task() for managed_node1/fail 34139 1726867644.58253: worker is 1 (out of 1 available) 34139 1726867644.58264: exiting _queue_task() for managed_node1/fail 34139 1726867644.58274: done queuing things up, now waiting for results queue to drain 34139 1726867644.58276: waiting for pending results... 
34139 1726867644.58599: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867644.58664: in run() - task 0affcac9-a3a5-c103-b8fd-000000000066 34139 1726867644.58687: variable 'ansible_search_path' from source: unknown 34139 1726867644.58694: variable 'ansible_search_path' from source: unknown 34139 1726867644.58734: calling self._execute() 34139 1726867644.58819: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.58833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.58883: variable 'omit' from source: magic vars 34139 1726867644.59213: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.59231: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.59426: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.59481: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.59484: when evaluation is False, skipping this task 34139 1726867644.59487: _execute() done 34139 1726867644.59489: dumping result to json 34139 1726867644.59590: done dumping result, returning 34139 1726867644.59594: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c103-b8fd-000000000066] 34139 1726867644.59597: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000066 34139 1726867644.59668: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000066 34139 1726867644.59672: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.59734: no more 
pending results, returning what we have 34139 1726867644.59738: results queue empty 34139 1726867644.59738: checking for any_errors_fatal 34139 1726867644.59743: done checking for any_errors_fatal 34139 1726867644.59744: checking for max_fail_percentage 34139 1726867644.59746: done checking for max_fail_percentage 34139 1726867644.59747: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.59748: done checking to see if all hosts have failed 34139 1726867644.59749: getting the remaining hosts for this loop 34139 1726867644.59751: done getting the remaining hosts for this loop 34139 1726867644.59754: getting the next task for host managed_node1 34139 1726867644.59761: done getting next task for host managed_node1 34139 1726867644.59764: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867644.59767: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.59789: getting variables 34139 1726867644.59791: in VariableManager get_vars() 34139 1726867644.59840: Calling all_inventory to load vars for managed_node1 34139 1726867644.59843: Calling groups_inventory to load vars for managed_node1 34139 1726867644.59845: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.59857: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.59860: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.59862: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.60225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.60451: done with get_vars() 34139 1726867644.60461: done getting variables 34139 1726867644.60520: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:24 -0400 (0:00:00.025) 0:00:03.351 ****** 34139 1726867644.60551: entering _queue_task() for managed_node1/fail 34139 1726867644.60991: worker is 1 (out of 1 available) 34139 1726867644.61005: exiting _queue_task() for managed_node1/fail 34139 1726867644.61016: done queuing things up, now waiting for results queue to drain 34139 1726867644.61018: waiting for pending results... 
34139 1726867644.61497: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867644.61502: in run() - task 0affcac9-a3a5-c103-b8fd-000000000067 34139 1726867644.61505: variable 'ansible_search_path' from source: unknown 34139 1726867644.61510: variable 'ansible_search_path' from source: unknown 34139 1726867644.61784: calling self._execute() 34139 1726867644.61819: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.61830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.61844: variable 'omit' from source: magic vars 34139 1726867644.62747: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.62805: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.63003: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.63182: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.63185: when evaluation is False, skipping this task 34139 1726867644.63188: _execute() done 34139 1726867644.63191: dumping result to json 34139 1726867644.63193: done dumping result, returning 34139 1726867644.63196: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c103-b8fd-000000000067] 34139 1726867644.63199: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000067 34139 1726867644.63484: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000067 34139 1726867644.63488: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.63532: no more pending 
results, returning what we have 34139 1726867644.63535: results queue empty 34139 1726867644.63536: checking for any_errors_fatal 34139 1726867644.63541: done checking for any_errors_fatal 34139 1726867644.63541: checking for max_fail_percentage 34139 1726867644.63543: done checking for max_fail_percentage 34139 1726867644.63544: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.63544: done checking to see if all hosts have failed 34139 1726867644.63545: getting the remaining hosts for this loop 34139 1726867644.63547: done getting the remaining hosts for this loop 34139 1726867644.63550: getting the next task for host managed_node1 34139 1726867644.63556: done getting next task for host managed_node1 34139 1726867644.63559: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867644.63562: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.63576: getting variables 34139 1726867644.63579: in VariableManager get_vars() 34139 1726867644.63627: Calling all_inventory to load vars for managed_node1 34139 1726867644.63629: Calling groups_inventory to load vars for managed_node1 34139 1726867644.63632: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.63642: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.63644: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.63647: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.64136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.64458: done with get_vars() 34139 1726867644.64469: done getting variables 34139 1726867644.64732: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:24 -0400 (0:00:00.042) 0:00:03.393 ****** 34139 1726867644.64763: entering _queue_task() for managed_node1/dnf 34139 1726867644.65406: worker is 1 (out of 1 available) 34139 1726867644.65420: exiting _queue_task() for managed_node1/dnf 34139 1726867644.65430: done queuing things up, now waiting for results queue to drain 34139 1726867644.65432: waiting for pending results... 
34139 1726867644.65649: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867644.65910: in run() - task 0affcac9-a3a5-c103-b8fd-000000000068 34139 1726867644.65914: variable 'ansible_search_path' from source: unknown 34139 1726867644.65917: variable 'ansible_search_path' from source: unknown 34139 1726867644.65919: calling self._execute() 34139 1726867644.65970: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.65983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.65998: variable 'omit' from source: magic vars 34139 1726867644.66380: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.66397: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.66521: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.66532: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.66562: when evaluation is False, skipping this task 34139 1726867644.66566: _execute() done 34139 1726867644.66568: dumping result to json 34139 1726867644.66576: done dumping result, returning 34139 1726867644.66581: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000068] 34139 1726867644.66672: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000068 34139 1726867644.66742: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000068 34139 1726867644.66745: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34139 1726867644.66795: no more pending results, returning what we have 34139 1726867644.66798: results queue empty 34139 1726867644.66799: checking for any_errors_fatal 34139 1726867644.66804: done checking for any_errors_fatal 34139 1726867644.66804: checking for max_fail_percentage 34139 1726867644.66806: done checking for max_fail_percentage 34139 1726867644.66806: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.66807: done checking to see if all hosts have failed 34139 1726867644.66808: getting the remaining hosts for this loop 34139 1726867644.66809: done getting the remaining hosts for this loop 34139 1726867644.66812: getting the next task for host managed_node1 34139 1726867644.66819: done getting next task for host managed_node1 34139 1726867644.66822: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867644.66825: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.66839: getting variables 34139 1726867644.66841: in VariableManager get_vars() 34139 1726867644.66990: Calling all_inventory to load vars for managed_node1 34139 1726867644.66993: Calling groups_inventory to load vars for managed_node1 34139 1726867644.66996: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.67007: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.67009: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.67012: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.67282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.67542: done with get_vars() 34139 1726867644.67553: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34139 1726867644.67668: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:24 -0400 (0:00:00.029) 0:00:03.423 ****** 34139 1726867644.67698: entering _queue_task() for managed_node1/yum 34139 1726867644.67934: worker is 1 (out of 1 available) 34139 1726867644.67947: exiting _queue_task() for managed_node1/yum 34139 1726867644.67960: done queuing things up, now waiting for results queue to drain 34139 1726867644.67962: waiting for pending results... 
34139 1726867644.68307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867644.68385: in run() - task 0affcac9-a3a5-c103-b8fd-000000000069 34139 1726867644.68389: variable 'ansible_search_path' from source: unknown 34139 1726867644.68392: variable 'ansible_search_path' from source: unknown 34139 1726867644.68432: calling self._execute() 34139 1726867644.68518: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.68582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.68586: variable 'omit' from source: magic vars 34139 1726867644.68913: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.68932: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.69058: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.69069: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.69079: when evaluation is False, skipping this task 34139 1726867644.69088: _execute() done 34139 1726867644.69095: dumping result to json 34139 1726867644.69103: done dumping result, returning 34139 1726867644.69114: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000069] 34139 1726867644.69164: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000069 34139 1726867644.69233: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000069 34139 1726867644.69236: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34139 1726867644.69319: no more pending results, returning what we have 34139 1726867644.69322: results queue empty 34139 1726867644.69323: checking for any_errors_fatal 34139 1726867644.69330: done checking for any_errors_fatal 34139 1726867644.69331: checking for max_fail_percentage 34139 1726867644.69333: done checking for max_fail_percentage 34139 1726867644.69334: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.69335: done checking to see if all hosts have failed 34139 1726867644.69335: getting the remaining hosts for this loop 34139 1726867644.69337: done getting the remaining hosts for this loop 34139 1726867644.69341: getting the next task for host managed_node1 34139 1726867644.69348: done getting next task for host managed_node1 34139 1726867644.69352: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867644.69355: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.69372: getting variables 34139 1726867644.69380: in VariableManager get_vars() 34139 1726867644.69427: Calling all_inventory to load vars for managed_node1 34139 1726867644.69430: Calling groups_inventory to load vars for managed_node1 34139 1726867644.69432: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.69443: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.69446: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.69449: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.69745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.70023: done with get_vars() 34139 1726867644.70033: done getting variables 34139 1726867644.70091: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:24 -0400 (0:00:00.024) 0:00:03.447 ****** 34139 1726867644.70120: entering _queue_task() for managed_node1/fail 34139 1726867644.70345: worker is 1 (out of 1 available) 34139 1726867644.70358: exiting _queue_task() for managed_node1/fail 34139 1726867644.70370: done queuing things up, now waiting for results queue to drain 34139 1726867644.70372: waiting for pending results... 
34139 1726867644.71003: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867644.71008: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006a 34139 1726867644.71012: variable 'ansible_search_path' from source: unknown 34139 1726867644.71087: variable 'ansible_search_path' from source: unknown 34139 1726867644.71316: calling self._execute() 34139 1726867644.71324: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.71335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.71435: variable 'omit' from source: magic vars 34139 1726867644.71953: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.71973: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.72097: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.72107: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.72114: when evaluation is False, skipping this task 34139 1726867644.72121: _execute() done 34139 1726867644.72127: dumping result to json 34139 1726867644.72135: done dumping result, returning 34139 1726867644.72144: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-00000000006a] 34139 1726867644.72153: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006a 34139 1726867644.72254: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006a 34139 1726867644.72257: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.72334: no more pending results, returning what we have 
34139 1726867644.72336: results queue empty 34139 1726867644.72337: checking for any_errors_fatal 34139 1726867644.72341: done checking for any_errors_fatal 34139 1726867644.72342: checking for max_fail_percentage 34139 1726867644.72343: done checking for max_fail_percentage 34139 1726867644.72344: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.72345: done checking to see if all hosts have failed 34139 1726867644.72345: getting the remaining hosts for this loop 34139 1726867644.72347: done getting the remaining hosts for this loop 34139 1726867644.72350: getting the next task for host managed_node1 34139 1726867644.72355: done getting next task for host managed_node1 34139 1726867644.72358: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34139 1726867644.72361: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.72375: getting variables 34139 1726867644.72378: in VariableManager get_vars() 34139 1726867644.72415: Calling all_inventory to load vars for managed_node1 34139 1726867644.72417: Calling groups_inventory to load vars for managed_node1 34139 1726867644.72419: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.72427: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.72430: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.72433: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.72643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.72858: done with get_vars() 34139 1726867644.72868: done getting variables 34139 1726867644.72924: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:24 -0400 (0:00:00.028) 0:00:03.475 ****** 34139 1726867644.72952: entering _queue_task() for managed_node1/package 34139 1726867644.73160: worker is 1 (out of 1 available) 34139 1726867644.73173: exiting _queue_task() for managed_node1/package 34139 1726867644.73385: done queuing things up, now waiting for results queue to drain 34139 1726867644.73387: waiting for pending results... 
34139 1726867644.73442: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
34139 1726867644.73584: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006b
34139 1726867644.73591: variable 'ansible_search_path' from source: unknown
34139 1726867644.73611: variable 'ansible_search_path' from source: unknown
34139 1726867644.73712: calling self._execute()
34139 1726867644.73884: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.73887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.73890: variable 'omit' from source: magic vars
34139 1726867644.74169: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.74189: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.74305: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.74316: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.74327: when evaluation is False, skipping this task
34139 1726867644.74339: _execute() done
34139 1726867644.74345: dumping result to json
34139 1726867644.74353: done dumping result, returning
34139 1726867644.74363: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c103-b8fd-00000000006b]
34139 1726867644.74372: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006b
34139 1726867644.74584: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006b
34139 1726867644.74588: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.74631: no more pending results, returning what we have
34139 1726867644.74634: results queue empty
34139 1726867644.74635: checking for any_errors_fatal
34139 1726867644.74640: done checking for any_errors_fatal
34139 1726867644.74641: checking for max_fail_percentage
34139 1726867644.74643: done checking for max_fail_percentage
34139 1726867644.74643: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.74644: done checking to see if all hosts have failed
34139 1726867644.74645: getting the remaining hosts for this loop
34139 1726867644.74646: done getting the remaining hosts for this loop
34139 1726867644.74649: getting the next task for host managed_node1
34139 1726867644.74655: done getting next task for host managed_node1
34139 1726867644.74657: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34139 1726867644.74660: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.74675: getting variables
34139 1726867644.74676: in VariableManager get_vars()
34139 1726867644.74721: Calling all_inventory to load vars for managed_node1
34139 1726867644.74723: Calling groups_inventory to load vars for managed_node1
34139 1726867644.74726: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.74735: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.74738: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.74741: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.74999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.75217: done with get_vars()
34139 1726867644.75227: done getting variables
34139 1726867644.75279: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 17:27:24 -0400 (0:00:00.023) 0:00:03.499 ******
34139 1726867644.75306: entering _queue_task() for managed_node1/package
34139 1726867644.75545: worker is 1 (out of 1 available)
34139 1726867644.75556: exiting _queue_task() for managed_node1/package
34139 1726867644.75566: done queuing things up, now waiting for results queue to drain
34139 1726867644.75568: waiting for pending results...
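Each skipped task in this trace shows the same pair of `Evaluated conditional` lines: a role-wide gate (`!= '6'`) that passes, then a task-level gate (`== '7'`) that fails, producing the `skipping:` result. A minimal sketch of a task shaped that way (hypothetical reconstruction for illustration, not the actual `fedora.linux_system_roles.network` source; the package list variable name is assumed):

```yaml
# Hypothetical task sketch matching the trace's skip pattern.
# Both conditions are rendered before the module runs; because the
# managed host's major version is not '7', the second gate is False
# and the task is skipped with "Conditional result was False".
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_provider_packages }}"  # assumed variable name
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

Note that `ansible_distribution_major_version` is a string fact, which is why the playbook compares against quoted `'6'` and `'7'` rather than integers.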
34139 1726867644.75895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34139 1726867644.75948: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006c
34139 1726867644.75966: variable 'ansible_search_path' from source: unknown
34139 1726867644.75973: variable 'ansible_search_path' from source: unknown
34139 1726867644.76015: calling self._execute()
34139 1726867644.76089: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.76106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.76122: variable 'omit' from source: magic vars
34139 1726867644.76460: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.76475: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.76592: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.76644: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.76647: when evaluation is False, skipping this task
34139 1726867644.76650: _execute() done
34139 1726867644.76652: dumping result to json
34139 1726867644.76655: done dumping result, returning
34139 1726867644.76658: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-00000000006c]
34139 1726867644.76660: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006c
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.76910: no more pending results, returning what we have
34139 1726867644.76912: results queue empty
34139 1726867644.76913: checking for any_errors_fatal
34139 1726867644.76917: done checking for any_errors_fatal
34139 1726867644.76917: checking for max_fail_percentage
34139 1726867644.76919: done checking for max_fail_percentage
34139 1726867644.76919: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.76920: done checking to see if all hosts have failed
34139 1726867644.76921: getting the remaining hosts for this loop
34139 1726867644.76922: done getting the remaining hosts for this loop
34139 1726867644.76924: getting the next task for host managed_node1
34139 1726867644.76929: done getting next task for host managed_node1
34139 1726867644.76931: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34139 1726867644.76934: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.76947: getting variables
34139 1726867644.76948: in VariableManager get_vars()
34139 1726867644.76982: Calling all_inventory to load vars for managed_node1
34139 1726867644.76985: Calling groups_inventory to load vars for managed_node1
34139 1726867644.76987: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.76994: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.76996: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.76999: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.77236: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006c
34139 1726867644.77240: WORKER PROCESS EXITING
34139 1726867644.77262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.77480: done with get_vars()
34139 1726867644.77489: done getting variables
34139 1726867644.77605: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 17:27:24 -0400 (0:00:00.023) 0:00:03.522 ******
34139 1726867644.77648: entering _queue_task() for managed_node1/package
34139 1726867644.78084: worker is 1 (out of 1 available)
34139 1726867644.78095: exiting _queue_task() for managed_node1/package
34139 1726867644.78106: done queuing things up, now waiting for results queue to drain
34139 1726867644.78108: waiting for pending results...
34139 1726867644.78640: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34139 1726867644.78803: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006d
34139 1726867644.78917: variable 'ansible_search_path' from source: unknown
34139 1726867644.79019: variable 'ansible_search_path' from source: unknown
34139 1726867644.79023: calling self._execute()
34139 1726867644.79167: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.79219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.79242: variable 'omit' from source: magic vars
34139 1726867644.79765: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.79788: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.79953: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.80003: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.80006: when evaluation is False, skipping this task
34139 1726867644.80016: _execute() done
34139 1726867644.80019: dumping result to json
34139 1726867644.80021: done dumping result, returning
34139 1726867644.80023: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-00000000006d]
34139 1726867644.80025: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006d
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.80292: no more pending results, returning what we have
34139 1726867644.80296: results queue empty
34139 1726867644.80296: checking for any_errors_fatal
34139 1726867644.80302: done checking for any_errors_fatal
34139 1726867644.80303: checking for max_fail_percentage
34139 1726867644.80305: done checking for max_fail_percentage
34139 1726867644.80305: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.80306: done checking to see if all hosts have failed
34139 1726867644.80309: getting the remaining hosts for this loop
34139 1726867644.80311: done getting the remaining hosts for this loop
34139 1726867644.80314: getting the next task for host managed_node1
34139 1726867644.80322: done getting next task for host managed_node1
34139 1726867644.80325: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34139 1726867644.80328: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.80459: getting variables
34139 1726867644.80461: in VariableManager get_vars()
34139 1726867644.80501: Calling all_inventory to load vars for managed_node1
34139 1726867644.80504: Calling groups_inventory to load vars for managed_node1
34139 1726867644.80506: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.80517: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.80520: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.80523: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.80772: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006d
34139 1726867644.80785: WORKER PROCESS EXITING
34139 1726867644.80791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.80926: done with get_vars()
34139 1726867644.80938: done getting variables
34139 1726867644.81013: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 17:27:24 -0400 (0:00:00.033) 0:00:03.556 ******
34139 1726867644.81043: entering _queue_task() for managed_node1/service
34139 1726867644.81301: worker is 1 (out of 1 available)
34139 1726867644.81321: exiting _queue_task() for managed_node1/service
34139 1726867644.81332: done queuing things up, now waiting for results queue to drain
34139 1726867644.81334: waiting for pending results...
34139 1726867644.81627: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34139 1726867644.81783: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006e
34139 1726867644.81787: variable 'ansible_search_path' from source: unknown
34139 1726867644.81789: variable 'ansible_search_path' from source: unknown
34139 1726867644.81799: calling self._execute()
34139 1726867644.81889: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.81899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.81917: variable 'omit' from source: magic vars
34139 1726867644.82357: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.82375: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.82828: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.82832: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.82835: when evaluation is False, skipping this task
34139 1726867644.82837: _execute() done
34139 1726867644.82840: dumping result to json
34139 1726867644.82842: done dumping result, returning
34139 1726867644.82844: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-00000000006e]
34139 1726867644.82846: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006e
34139 1726867644.82922: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006e
34139 1726867644.82925: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.82968: no more pending results, returning what we have
34139 1726867644.82970: results queue empty
34139 1726867644.82971: checking for any_errors_fatal
34139 1726867644.82976: done checking for any_errors_fatal
34139 1726867644.82979: checking for max_fail_percentage
34139 1726867644.82981: done checking for max_fail_percentage
34139 1726867644.82981: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.82983: done checking to see if all hosts have failed
34139 1726867644.82983: getting the remaining hosts for this loop
34139 1726867644.82985: done getting the remaining hosts for this loop
34139 1726867644.82988: getting the next task for host managed_node1
34139 1726867644.82993: done getting next task for host managed_node1
34139 1726867644.82996: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34139 1726867644.82999: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.83016: getting variables
34139 1726867644.83018: in VariableManager get_vars()
34139 1726867644.83066: Calling all_inventory to load vars for managed_node1
34139 1726867644.83069: Calling groups_inventory to load vars for managed_node1
34139 1726867644.83072: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.83152: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.83156: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.83159: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.83409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.83640: done with get_vars()
34139 1726867644.83649: done getting variables
34139 1726867644.83715: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 17:27:24 -0400 (0:00:00.027) 0:00:03.583 ******
34139 1726867644.83744: entering _queue_task() for managed_node1/service
34139 1726867644.84084: worker is 1 (out of 1 available)
34139 1726867644.84098: exiting _queue_task() for managed_node1/service
34139 1726867644.84108: done queuing things up, now waiting for results queue to drain
34139 1726867644.84110: waiting for pending results...
34139 1726867644.84356: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34139 1726867644.84442: in run() - task 0affcac9-a3a5-c103-b8fd-00000000006f
34139 1726867644.84452: variable 'ansible_search_path' from source: unknown
34139 1726867644.84456: variable 'ansible_search_path' from source: unknown
34139 1726867644.84486: calling self._execute()
34139 1726867644.84549: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.84555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.84563: variable 'omit' from source: magic vars
34139 1726867644.84830: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.84841: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.84924: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.84927: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.84930: when evaluation is False, skipping this task
34139 1726867644.84932: _execute() done
34139 1726867644.84935: dumping result to json
34139 1726867644.84941: done dumping result, returning
34139 1726867644.84946: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c103-b8fd-00000000006f]
34139 1726867644.84951: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006f
34139 1726867644.85036: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000006f
34139 1726867644.85039: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34139 1726867644.85082: no more pending results, returning what we have
34139 1726867644.85088: results queue empty
34139 1726867644.85089: checking for any_errors_fatal
34139 1726867644.85095: done checking for any_errors_fatal
34139 1726867644.85095: checking for max_fail_percentage
34139 1726867644.85097: done checking for max_fail_percentage
34139 1726867644.85097: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.85098: done checking to see if all hosts have failed
34139 1726867644.85099: getting the remaining hosts for this loop
34139 1726867644.85100: done getting the remaining hosts for this loop
34139 1726867644.85103: getting the next task for host managed_node1
34139 1726867644.85109: done getting next task for host managed_node1
34139 1726867644.85112: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34139 1726867644.85115: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.85129: getting variables
34139 1726867644.85130: in VariableManager get_vars()
34139 1726867644.85166: Calling all_inventory to load vars for managed_node1
34139 1726867644.85168: Calling groups_inventory to load vars for managed_node1
34139 1726867644.85170: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.85180: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.85183: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.85186: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.85306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.85453: done with get_vars()
34139 1726867644.85460: done getting variables
34139 1726867644.85500: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 17:27:24 -0400 (0:00:00.017) 0:00:03.601 ******
34139 1726867644.85523: entering _queue_task() for managed_node1/service
34139 1726867644.85702: worker is 1 (out of 1 available)
34139 1726867644.85717: exiting _queue_task() for managed_node1/service
34139 1726867644.85730: done queuing things up, now waiting for results queue to drain
34139 1726867644.85731: waiting for pending results...
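Unlike the other skips, the "Enable and start NetworkManager" (and later "Enable network service") results print a `"censored"` placeholder instead of `false_condition`: the trace itself says this is because `no_log: true` was set for that result. A minimal sketch of a task that would behave this way (illustrative only, not the role's actual task definition):

```yaml
# Minimal sketch, assuming no_log is set on the service task.
# With no_log enabled, Ansible replaces the result body with the
# "censored" message seen in the trace, even for a skipped task.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    enabled: true
    state: started
  no_log: true
  when: ansible_distribution_major_version == '7'
```

This is worth knowing when debugging: `no_log` suppresses the skip reason along with everything else, so the only way to see why such a task skipped is the `Evaluated conditional` lines in the verbose trace above it.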
34139 1726867644.85887: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34139 1726867644.85972: in run() - task 0affcac9-a3a5-c103-b8fd-000000000070
34139 1726867644.85984: variable 'ansible_search_path' from source: unknown
34139 1726867644.85987: variable 'ansible_search_path' from source: unknown
34139 1726867644.86142: calling self._execute()
34139 1726867644.86267: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.86696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.86700: variable 'omit' from source: magic vars
34139 1726867644.87097: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.87118: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.87336: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.87394: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.87425: when evaluation is False, skipping this task
34139 1726867644.87461: _execute() done
34139 1726867644.87490: dumping result to json
34139 1726867644.87530: done dumping result, returning
34139 1726867644.87592: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c103-b8fd-000000000070]
34139 1726867644.87604: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000070
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867644.87798: no more pending results, returning what we have
34139 1726867644.87801: results queue empty
34139 1726867644.87802: checking for any_errors_fatal
34139 1726867644.87807: done checking for any_errors_fatal
34139 1726867644.87808: checking for max_fail_percentage
34139 1726867644.87810: done checking for max_fail_percentage
34139 1726867644.87810: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.87811: done checking to see if all hosts have failed
34139 1726867644.87812: getting the remaining hosts for this loop
34139 1726867644.87813: done getting the remaining hosts for this loop
34139 1726867644.87816: getting the next task for host managed_node1
34139 1726867644.87823: done getting next task for host managed_node1
34139 1726867644.87826: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34139 1726867644.87829: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.87845: getting variables
34139 1726867644.87847: in VariableManager get_vars()
34139 1726867644.87896: Calling all_inventory to load vars for managed_node1
34139 1726867644.87899: Calling groups_inventory to load vars for managed_node1
34139 1726867644.87901: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.87911: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.87913: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.87916: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.88078: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000070
34139 1726867644.88082: WORKER PROCESS EXITING
34139 1726867644.88094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.88231: done with get_vars()
34139 1726867644.88239: done getting variables
34139 1726867644.88280: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 17:27:24 -0400 (0:00:00.027) 0:00:03.629 ******
34139 1726867644.88301: entering _queue_task() for managed_node1/service
34139 1726867644.88484: worker is 1 (out of 1 available)
34139 1726867644.88498: exiting _queue_task() for managed_node1/service
34139 1726867644.88513: done queuing things up, now waiting for results queue to drain
34139 1726867644.88515: waiting for pending results...
34139 1726867644.88687: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
34139 1726867644.88762: in run() - task 0affcac9-a3a5-c103-b8fd-000000000071
34139 1726867644.88773: variable 'ansible_search_path' from source: unknown
34139 1726867644.88779: variable 'ansible_search_path' from source: unknown
34139 1726867644.88809: calling self._execute()
34139 1726867644.88864: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867644.88868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867644.88878: variable 'omit' from source: magic vars
34139 1726867644.89139: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.89149: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867644.89274: variable 'ansible_distribution_major_version' from source: facts
34139 1726867644.89297: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867644.89301: when evaluation is False, skipping this task
34139 1726867644.89303: _execute() done
34139 1726867644.89306: dumping result to json
34139 1726867644.89308: done dumping result, returning
34139 1726867644.89310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c103-b8fd-000000000071]
34139 1726867644.89312: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000071
34139 1726867644.89417: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000071
34139 1726867644.89420: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34139 1726867644.89542: no more pending results, returning what we have
34139 1726867644.89546: results queue empty
34139 1726867644.89547: checking for any_errors_fatal
34139 1726867644.89551: done checking for any_errors_fatal
34139 1726867644.89552: checking for max_fail_percentage
34139 1726867644.89553: done checking for max_fail_percentage
34139 1726867644.89554: checking to see if all hosts have failed and the running result is not ok
34139 1726867644.89555: done checking to see if all hosts have failed
34139 1726867644.89555: getting the remaining hosts for this loop
34139 1726867644.89556: done getting the remaining hosts for this loop
34139 1726867644.89559: getting the next task for host managed_node1
34139 1726867644.89564: done getting next task for host managed_node1
34139 1726867644.89566: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34139 1726867644.89569: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867644.89583: getting variables
34139 1726867644.89585: in VariableManager get_vars()
34139 1726867644.89619: Calling all_inventory to load vars for managed_node1
34139 1726867644.89621: Calling groups_inventory to load vars for managed_node1
34139 1726867644.89623: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867644.89631: Calling all_plugins_play to load vars for managed_node1
34139 1726867644.89633: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867644.89636: Calling groups_plugins_play to load vars for managed_node1
34139 1726867644.89842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867644.90039: done with get_vars()
34139 1726867644.90048: done getting variables
34139 1726867644.90098: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 17:27:24 -0400 (0:00:00.018) 0:00:03.647 ******
34139 1726867644.90128: entering _queue_task() for managed_node1/copy
34139 1726867644.90351: worker is 1 (out of 1 available)
34139 1726867644.90364: exiting _queue_task() for managed_node1/copy
34139 1726867644.90378: done queuing things up, now waiting for results queue to drain
34139 1726867644.90380: waiting for pending results...
34139 1726867644.90692: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34139 1726867644.90787: in run() - task 0affcac9-a3a5-c103-b8fd-000000000072 34139 1726867644.90793: variable 'ansible_search_path' from source: unknown 34139 1726867644.90795: variable 'ansible_search_path' from source: unknown 34139 1726867644.90825: calling self._execute() 34139 1726867644.90887: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.90890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.90900: variable 'omit' from source: magic vars 34139 1726867644.91250: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.91485: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.91489: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.91491: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.91494: when evaluation is False, skipping this task 34139 1726867644.91496: _execute() done 34139 1726867644.91498: dumping result to json 34139 1726867644.91500: done dumping result, returning 34139 1726867644.91503: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c103-b8fd-000000000072] 34139 1726867644.91505: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000072 34139 1726867644.91579: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000072 34139 1726867644.91583: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.91634: no more pending results, returning what we have 34139 1726867644.91637: results queue empty 34139 
1726867644.91638: checking for any_errors_fatal 34139 1726867644.91644: done checking for any_errors_fatal 34139 1726867644.91645: checking for max_fail_percentage 34139 1726867644.91646: done checking for max_fail_percentage 34139 1726867644.91647: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.91648: done checking to see if all hosts have failed 34139 1726867644.91649: getting the remaining hosts for this loop 34139 1726867644.91650: done getting the remaining hosts for this loop 34139 1726867644.91654: getting the next task for host managed_node1 34139 1726867644.91661: done getting next task for host managed_node1 34139 1726867644.91664: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867644.91667: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.91685: getting variables 34139 1726867644.91687: in VariableManager get_vars() 34139 1726867644.91734: Calling all_inventory to load vars for managed_node1 34139 1726867644.91737: Calling groups_inventory to load vars for managed_node1 34139 1726867644.91739: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.91750: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.91753: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.91756: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.92085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.92268: done with get_vars() 34139 1726867644.92275: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:24 -0400 (0:00:00.022) 0:00:03.669 ****** 34139 1726867644.92339: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867644.92511: worker is 1 (out of 1 available) 34139 1726867644.92523: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867644.92534: done queuing things up, now waiting for results queue to drain 34139 1726867644.92536: waiting for pending results... 
34139 1726867644.92695: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867644.92771: in run() - task 0affcac9-a3a5-c103-b8fd-000000000073 34139 1726867644.92784: variable 'ansible_search_path' from source: unknown 34139 1726867644.92787: variable 'ansible_search_path' from source: unknown 34139 1726867644.92815: calling self._execute() 34139 1726867644.92876: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.92882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.92887: variable 'omit' from source: magic vars 34139 1726867644.93147: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.93156: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.93234: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.93238: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.93241: when evaluation is False, skipping this task 34139 1726867644.93245: _execute() done 34139 1726867644.93248: dumping result to json 34139 1726867644.93251: done dumping result, returning 34139 1726867644.93259: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c103-b8fd-000000000073] 34139 1726867644.93264: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000073 34139 1726867644.93352: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000073 34139 1726867644.93355: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.93410: no more pending results, returning what we have 34139 1726867644.93412: results queue empty 34139 1726867644.93413: checking 
for any_errors_fatal 34139 1726867644.93418: done checking for any_errors_fatal 34139 1726867644.93419: checking for max_fail_percentage 34139 1726867644.93420: done checking for max_fail_percentage 34139 1726867644.93421: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.93422: done checking to see if all hosts have failed 34139 1726867644.93423: getting the remaining hosts for this loop 34139 1726867644.93424: done getting the remaining hosts for this loop 34139 1726867644.93426: getting the next task for host managed_node1 34139 1726867644.93431: done getting next task for host managed_node1 34139 1726867644.93434: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867644.93436: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.93449: getting variables 34139 1726867644.93450: in VariableManager get_vars() 34139 1726867644.93482: Calling all_inventory to load vars for managed_node1 34139 1726867644.93484: Calling groups_inventory to load vars for managed_node1 34139 1726867644.93485: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.93496: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.93498: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.93499: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.93639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.93765: done with get_vars() 34139 1726867644.93772: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.684 ****** 34139 1726867644.93829: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867644.94006: worker is 1 (out of 1 available) 34139 1726867644.94021: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867644.94032: done queuing things up, now waiting for results queue to drain 34139 1726867644.94034: waiting for pending results... 
34139 1726867644.94410: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867644.94421: in run() - task 0affcac9-a3a5-c103-b8fd-000000000074 34139 1726867644.94439: variable 'ansible_search_path' from source: unknown 34139 1726867644.94484: variable 'ansible_search_path' from source: unknown 34139 1726867644.94488: calling self._execute() 34139 1726867644.94564: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.94574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.94592: variable 'omit' from source: magic vars 34139 1726867644.94958: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.94992: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.95066: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.95070: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.95072: when evaluation is False, skipping this task 34139 1726867644.95078: _execute() done 34139 1726867644.95081: dumping result to json 34139 1726867644.95083: done dumping result, returning 34139 1726867644.95090: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c103-b8fd-000000000074] 34139 1726867644.95095: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000074 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867644.95232: no more pending results, returning what we have 34139 1726867644.95235: results queue empty 34139 1726867644.95236: checking for any_errors_fatal 34139 1726867644.95240: done checking for any_errors_fatal 34139 1726867644.95241: checking for max_fail_percentage 34139 1726867644.95242: done 
checking for max_fail_percentage 34139 1726867644.95243: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.95244: done checking to see if all hosts have failed 34139 1726867644.95245: getting the remaining hosts for this loop 34139 1726867644.95246: done getting the remaining hosts for this loop 34139 1726867644.95248: getting the next task for host managed_node1 34139 1726867644.95254: done getting next task for host managed_node1 34139 1726867644.95256: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34139 1726867644.95259: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.95287: getting variables 34139 1726867644.95288: in VariableManager get_vars() 34139 1726867644.95319: Calling all_inventory to load vars for managed_node1 34139 1726867644.95320: Calling groups_inventory to load vars for managed_node1 34139 1726867644.95322: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.95329: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.95330: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.95333: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.95446: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000074 34139 1726867644.95449: WORKER PROCESS EXITING 34139 1726867644.95458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.95584: done with get_vars() 34139 1726867644.95594: done getting variables 34139 1726867644.95632: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:24 -0400 (0:00:00.018) 0:00:03.702 ****** 34139 1726867644.95653: entering _queue_task() for managed_node1/debug 34139 1726867644.95811: worker is 1 (out of 1 available) 34139 1726867644.95824: exiting _queue_task() for managed_node1/debug 34139 1726867644.95834: done queuing things up, now waiting for results queue to drain 34139 1726867644.95836: waiting for pending results... 
34139 1726867644.95987: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34139 1726867644.96055: in run() - task 0affcac9-a3a5-c103-b8fd-000000000075 34139 1726867644.96069: variable 'ansible_search_path' from source: unknown 34139 1726867644.96073: variable 'ansible_search_path' from source: unknown 34139 1726867644.96103: calling self._execute() 34139 1726867644.96157: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.96161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.96169: variable 'omit' from source: magic vars 34139 1726867644.96648: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.96657: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.96732: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.96736: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.96739: when evaluation is False, skipping this task 34139 1726867644.96743: _execute() done 34139 1726867644.96746: dumping result to json 34139 1726867644.96748: done dumping result, returning 34139 1726867644.96756: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c103-b8fd-000000000075] 34139 1726867644.96759: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000075 34139 1726867644.96839: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000075 34139 1726867644.96841: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867644.96886: no more pending results, returning what we have 34139 1726867644.96889: results queue empty 34139 1726867644.96890: checking for any_errors_fatal 34139 1726867644.96895: done 
checking for any_errors_fatal 34139 1726867644.96895: checking for max_fail_percentage 34139 1726867644.96897: done checking for max_fail_percentage 34139 1726867644.96897: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.96898: done checking to see if all hosts have failed 34139 1726867644.96899: getting the remaining hosts for this loop 34139 1726867644.96900: done getting the remaining hosts for this loop 34139 1726867644.96903: getting the next task for host managed_node1 34139 1726867644.96909: done getting next task for host managed_node1 34139 1726867644.96912: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34139 1726867644.96915: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.96928: getting variables 34139 1726867644.96929: in VariableManager get_vars() 34139 1726867644.96965: Calling all_inventory to load vars for managed_node1 34139 1726867644.96967: Calling groups_inventory to load vars for managed_node1 34139 1726867644.96968: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.96974: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.96975: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.96979: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.97245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.97367: done with get_vars() 34139 1726867644.97373: done getting variables 34139 1726867644.97417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:24 -0400 (0:00:00.017) 0:00:03.720 ****** 34139 1726867644.97435: entering _queue_task() for managed_node1/debug 34139 1726867644.97592: worker is 1 (out of 1 available) 34139 1726867644.97605: exiting _queue_task() for managed_node1/debug 34139 1726867644.97617: done queuing things up, now waiting for results queue to drain 34139 1726867644.97619: waiting for pending results... 
34139 1726867644.97762: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34139 1726867644.97834: in run() - task 0affcac9-a3a5-c103-b8fd-000000000076 34139 1726867644.97850: variable 'ansible_search_path' from source: unknown 34139 1726867644.97854: variable 'ansible_search_path' from source: unknown 34139 1726867644.97875: calling self._execute() 34139 1726867644.97930: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.97934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.97941: variable 'omit' from source: magic vars 34139 1726867644.98181: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.98194: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.98268: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.98271: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.98276: when evaluation is False, skipping this task 34139 1726867644.98287: _execute() done 34139 1726867644.98290: dumping result to json 34139 1726867644.98293: done dumping result, returning 34139 1726867644.98295: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c103-b8fd-000000000076] 34139 1726867644.98298: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000076 34139 1726867644.98370: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000076 34139 1726867644.98373: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867644.98441: no more pending results, returning what we have 34139 1726867644.98443: results queue empty 34139 1726867644.98444: checking for any_errors_fatal 34139 1726867644.98449: done 
checking for any_errors_fatal 34139 1726867644.98450: checking for max_fail_percentage 34139 1726867644.98451: done checking for max_fail_percentage 34139 1726867644.98452: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.98453: done checking to see if all hosts have failed 34139 1726867644.98454: getting the remaining hosts for this loop 34139 1726867644.98455: done getting the remaining hosts for this loop 34139 1726867644.98458: getting the next task for host managed_node1 34139 1726867644.98462: done getting next task for host managed_node1 34139 1726867644.98465: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34139 1726867644.98467: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.98476: getting variables 34139 1726867644.98479: in VariableManager get_vars() 34139 1726867644.98511: Calling all_inventory to load vars for managed_node1 34139 1726867644.98513: Calling groups_inventory to load vars for managed_node1 34139 1726867644.98515: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.98520: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.98522: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.98524: Calling groups_plugins_play to load vars for managed_node1 34139 1726867644.98630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867644.98773: done with get_vars() 34139 1726867644.98782: done getting variables 34139 1726867644.98819: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:24 -0400 (0:00:00.014) 0:00:03.734 ****** 34139 1726867644.98840: entering _queue_task() for managed_node1/debug 34139 1726867644.99000: worker is 1 (out of 1 available) 34139 1726867644.99013: exiting _queue_task() for managed_node1/debug 34139 1726867644.99024: done queuing things up, now waiting for results queue to drain 34139 1726867644.99026: waiting for pending results... 
34139 1726867644.99161: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34139 1726867644.99231: in run() - task 0affcac9-a3a5-c103-b8fd-000000000077 34139 1726867644.99241: variable 'ansible_search_path' from source: unknown 34139 1726867644.99245: variable 'ansible_search_path' from source: unknown 34139 1726867644.99270: calling self._execute() 34139 1726867644.99329: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867644.99332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867644.99341: variable 'omit' from source: magic vars 34139 1726867644.99581: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.99590: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867644.99664: variable 'ansible_distribution_major_version' from source: facts 34139 1726867644.99668: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867644.99671: when evaluation is False, skipping this task 34139 1726867644.99673: _execute() done 34139 1726867644.99676: dumping result to json 34139 1726867644.99680: done dumping result, returning 34139 1726867644.99688: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c103-b8fd-000000000077] 34139 1726867644.99691: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000077 34139 1726867644.99769: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000077 34139 1726867644.99771: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867644.99829: no more pending results, returning what we have 34139 1726867644.99832: results queue empty 34139 1726867644.99833: checking for any_errors_fatal 34139 1726867644.99837: done checking for 
any_errors_fatal 34139 1726867644.99838: checking for max_fail_percentage 34139 1726867644.99839: done checking for max_fail_percentage 34139 1726867644.99840: checking to see if all hosts have failed and the running result is not ok 34139 1726867644.99841: done checking to see if all hosts have failed 34139 1726867644.99841: getting the remaining hosts for this loop 34139 1726867644.99842: done getting the remaining hosts for this loop 34139 1726867644.99845: getting the next task for host managed_node1 34139 1726867644.99849: done getting next task for host managed_node1 34139 1726867644.99852: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34139 1726867644.99855: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867644.99867: getting variables 34139 1726867644.99868: in VariableManager get_vars() 34139 1726867644.99898: Calling all_inventory to load vars for managed_node1 34139 1726867644.99900: Calling groups_inventory to load vars for managed_node1 34139 1726867644.99902: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867644.99909: Calling all_plugins_play to load vars for managed_node1 34139 1726867644.99911: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867644.99913: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.00021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.00151: done with get_vars() 34139 1726867645.00158: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:25 -0400 (0:00:00.013) 0:00:03.748 ****** 34139 1726867645.00221: entering _queue_task() for managed_node1/ping 34139 1726867645.00378: worker is 1 (out of 1 available) 34139 1726867645.00392: exiting _queue_task() for managed_node1/ping 34139 1726867645.00403: done queuing things up, now waiting for results queue to drain 34139 1726867645.00404: waiting for pending results... 
34139 1726867645.00538: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
34139 1726867645.00619: in run() - task 0affcac9-a3a5-c103-b8fd-000000000078
34139 1726867645.00636: variable 'ansible_search_path' from source: unknown
34139 1726867645.00639: variable 'ansible_search_path' from source: unknown
34139 1726867645.00663: calling self._execute()
34139 1726867645.00727: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.00730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.00744: variable 'omit' from source: magic vars
34139 1726867645.01001: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.01009: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.01087: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.01091: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.01094: when evaluation is False, skipping this task
34139 1726867645.01097: _execute() done
34139 1726867645.01100: dumping result to json
34139 1726867645.01103: done dumping result, returning
34139 1726867645.01110: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c103-b8fd-000000000078]
34139 1726867645.01117: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000078
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.01231: no more pending results, returning what we have
34139 1726867645.01234: results queue empty
34139 1726867645.01235: checking for any_errors_fatal
34139 1726867645.01240: done checking for any_errors_fatal
34139 1726867645.01241: checking for max_fail_percentage
34139 1726867645.01242: done checking for max_fail_percentage
34139 1726867645.01243: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.01244: done checking to see if all hosts have failed
34139 1726867645.01245: getting the remaining hosts for this loop
34139 1726867645.01246: done getting the remaining hosts for this loop
34139 1726867645.01249: getting the next task for host managed_node1
34139 1726867645.01258: done getting next task for host managed_node1
34139 1726867645.01259: ^ task is: TASK: meta (role_complete)
34139 1726867645.01262: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.01275: getting variables
34139 1726867645.01279: in VariableManager get_vars()
34139 1726867645.01314: Calling all_inventory to load vars for managed_node1
34139 1726867645.01317: Calling groups_inventory to load vars for managed_node1
34139 1726867645.01319: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.01326: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.01328: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.01331: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.01467: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000078
34139 1726867645.01470: WORKER PROCESS EXITING
34139 1726867645.01482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.01603: done with get_vars()
34139 1726867645.01611: done getting variables
34139 1726867645.01660: done queuing things up, now waiting for results queue to drain
34139 1726867645.01661: results queue empty
34139 1726867645.01662: checking for any_errors_fatal
34139 1726867645.01664: done checking for any_errors_fatal
34139 1726867645.01664: checking for max_fail_percentage
34139 1726867645.01665: done checking for max_fail_percentage
34139 1726867645.01665: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.01666: done checking to see if all hosts have failed
34139 1726867645.01666: getting the remaining hosts for this loop
34139 1726867645.01667: done getting the remaining hosts for this loop
34139 1726867645.01668: getting the next task for host managed_node1
34139 1726867645.01670: done getting next task for host managed_node1
34139 1726867645.01671: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP
34139 1726867645.01672: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.01674: getting variables
34139 1726867645.01674: in VariableManager get_vars()
34139 1726867645.01686: Calling all_inventory to load vars for managed_node1
34139 1726867645.01688: Calling groups_inventory to load vars for managed_node1
34139 1726867645.01689: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.01691: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.01693: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.01694: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.01776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.01893: done with get_vars()
34139 1726867645.01899: done getting variables
34139 1726867645.01923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: wireless connection with 802.1x TLS-EAP] ***************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53
Friday 20 September 2024  17:27:25 -0400 (0:00:00.017)       0:00:03.765 ******
34139 1726867645.01940: entering _queue_task() for managed_node1/debug
34139 1726867645.02097: worker is 1 (out of 1 available)
34139 1726867645.02109: exiting _queue_task() for managed_node1/debug
34139 1726867645.02119: done queuing things up, now waiting for results queue to drain
34139 1726867645.02121: waiting for pending results...
34139 1726867645.02261: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP
34139 1726867645.02311: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000a8
34139 1726867645.02323: variable 'ansible_search_path' from source: unknown
34139 1726867645.02353: calling self._execute()
34139 1726867645.02407: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.02413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.02421: variable 'omit' from source: magic vars
34139 1726867645.02658: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.02668: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.02746: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.02750: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.02753: when evaluation is False, skipping this task
34139 1726867645.02756: _execute() done
34139 1726867645.02758: dumping result to json
34139 1726867645.02761: done dumping result, returning
34139 1726867645.02767: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP [0affcac9-a3a5-c103-b8fd-0000000000a8]
34139 1726867645.02771: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000a8
34139 1726867645.02849: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000a8
34139 1726867645.02853: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867645.02919: no more pending results, returning what we have
34139 1726867645.02922: results queue empty
34139 1726867645.02922: checking for any_errors_fatal
34139 1726867645.02924: done checking for any_errors_fatal
34139 1726867645.02925: checking for max_fail_percentage
34139 1726867645.02926: done checking for max_fail_percentage
34139 1726867645.02927: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.02927: done checking to see if all hosts have failed
34139 1726867645.02928: getting the remaining hosts for this loop
34139 1726867645.02929: done getting the remaining hosts for this loop
34139 1726867645.02931: getting the next task for host managed_node1
34139 1726867645.02936: done getting next task for host managed_node1
34139 1726867645.02940: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34139 1726867645.02943: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.02955: getting variables
34139 1726867645.02956: in VariableManager get_vars()
34139 1726867645.02984: Calling all_inventory to load vars for managed_node1
34139 1726867645.02986: Calling groups_inventory to load vars for managed_node1
34139 1726867645.02987: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.02992: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.02994: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.02995: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.03130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.03254: done with get_vars()
34139 1726867645.03261: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024  17:27:25 -0400 (0:00:00.013)       0:00:03.779 ******
34139 1726867645.03320: entering _queue_task() for managed_node1/include_tasks
34139 1726867645.03474: worker is 1 (out of 1 available)
34139 1726867645.03489: exiting _queue_task() for managed_node1/include_tasks
34139 1726867645.03499: done queuing things up, now waiting for results queue to drain
34139 1726867645.03501: waiting for pending results...
34139 1726867645.03638: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34139 1726867645.03704: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b0
34139 1726867645.03717: variable 'ansible_search_path' from source: unknown
34139 1726867645.03723: variable 'ansible_search_path' from source: unknown
34139 1726867645.03747: calling self._execute()
34139 1726867645.03798: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.03801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.03810: variable 'omit' from source: magic vars
34139 1726867645.04046: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.04055: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.04136: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.04139: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.04142: when evaluation is False, skipping this task
34139 1726867645.04146: _execute() done
34139 1726867645.04148: dumping result to json
34139 1726867645.04153: done dumping result, returning
34139 1726867645.04159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c103-b8fd-0000000000b0]
34139 1726867645.04163: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b0
34139 1726867645.04239: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b0
34139 1726867645.04242: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.04306: no more pending results, returning what we have
34139 1726867645.04308: results queue empty
34139 1726867645.04309: checking for any_errors_fatal
34139 1726867645.04315: done checking for any_errors_fatal
34139 1726867645.04315: checking for max_fail_percentage
34139 1726867645.04317: done checking for max_fail_percentage
34139 1726867645.04318: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.04319: done checking to see if all hosts have failed
34139 1726867645.04319: getting the remaining hosts for this loop
34139 1726867645.04320: done getting the remaining hosts for this loop
34139 1726867645.04323: getting the next task for host managed_node1
34139 1726867645.04327: done getting next task for host managed_node1
34139 1726867645.04330: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34139 1726867645.04333: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.04345: getting variables
34139 1726867645.04346: in VariableManager get_vars()
34139 1726867645.04372: Calling all_inventory to load vars for managed_node1
34139 1726867645.04373: Calling groups_inventory to load vars for managed_node1
34139 1726867645.04374: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.04381: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.04383: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.04385: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.04492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.04620: done with get_vars()
34139 1726867645.04627: done getting variables
34139 1726867645.04662: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024  17:27:25 -0400 (0:00:00.013)       0:00:03.792 ******
34139 1726867645.04685: entering _queue_task() for managed_node1/debug
34139 1726867645.04842: worker is 1 (out of 1 available)
34139 1726867645.04854: exiting _queue_task() for managed_node1/debug
34139 1726867645.04864: done queuing things up, now waiting for results queue to drain
34139 1726867645.04866: waiting for pending results...
34139 1726867645.05008: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
34139 1726867645.05072: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b1
34139 1726867645.05084: variable 'ansible_search_path' from source: unknown
34139 1726867645.05088: variable 'ansible_search_path' from source: unknown
34139 1726867645.05116: calling self._execute()
34139 1726867645.05167: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.05170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.05180: variable 'omit' from source: magic vars
34139 1726867645.05409: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.05422: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.05496: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.05500: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.05503: when evaluation is False, skipping this task
34139 1726867645.05505: _execute() done
34139 1726867645.05508: dumping result to json
34139 1726867645.05514: done dumping result, returning
34139 1726867645.05521: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c103-b8fd-0000000000b1]
34139 1726867645.05525: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b1
34139 1726867645.05603: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b1
34139 1726867645.05606: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867645.05668: no more pending results, returning what we have
34139 1726867645.05671: results queue empty
34139 1726867645.05671: checking for any_errors_fatal
34139 1726867645.05675: done checking for any_errors_fatal
34139 1726867645.05676: checking for max_fail_percentage
34139 1726867645.05684: done checking for max_fail_percentage
34139 1726867645.05685: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.05686: done checking to see if all hosts have failed
34139 1726867645.05686: getting the remaining hosts for this loop
34139 1726867645.05687: done getting the remaining hosts for this loop
34139 1726867645.05690: getting the next task for host managed_node1
34139 1726867645.05694: done getting next task for host managed_node1
34139 1726867645.05698: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34139 1726867645.05699: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.05710: getting variables
34139 1726867645.05711: in VariableManager get_vars()
34139 1726867645.05736: Calling all_inventory to load vars for managed_node1
34139 1726867645.05737: Calling groups_inventory to load vars for managed_node1
34139 1726867645.05739: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.05744: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.05745: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.05747: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.05885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.06012: done with get_vars()
34139 1726867645.06018: done getting variables
34139 1726867645.06054: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024  17:27:25 -0400 (0:00:00.013)       0:00:03.806 ******
34139 1726867645.06072: entering _queue_task() for managed_node1/fail
34139 1726867645.06230: worker is 1 (out of 1 available)
34139 1726867645.06243: exiting _queue_task() for managed_node1/fail
34139 1726867645.06254: done queuing things up, now waiting for results queue to drain
34139 1726867645.06256: waiting for pending results...
34139 1726867645.06397: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34139 1726867645.06468: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b2
34139 1726867645.06483: variable 'ansible_search_path' from source: unknown
34139 1726867645.06487: variable 'ansible_search_path' from source: unknown
34139 1726867645.06509: calling self._execute()
34139 1726867645.06560: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.06563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.06572: variable 'omit' from source: magic vars
34139 1726867645.06805: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.06821: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.06890: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.06894: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.06897: when evaluation is False, skipping this task
34139 1726867645.06900: _execute() done
34139 1726867645.06903: dumping result to json
34139 1726867645.06907: done dumping result, returning
34139 1726867645.06917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c103-b8fd-0000000000b2]
34139 1726867645.06920: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b2
34139 1726867645.06998: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b2
34139 1726867645.07001: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.07064: no more pending results, returning what we have
34139 1726867645.07067: results queue empty
34139 1726867645.07067: checking for any_errors_fatal
34139 1726867645.07071: done checking for any_errors_fatal
34139 1726867645.07072: checking for max_fail_percentage
34139 1726867645.07073: done checking for max_fail_percentage
34139 1726867645.07074: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.07075: done checking to see if all hosts have failed
34139 1726867645.07076: getting the remaining hosts for this loop
34139 1726867645.07076: done getting the remaining hosts for this loop
34139 1726867645.07081: getting the next task for host managed_node1
34139 1726867645.07086: done getting next task for host managed_node1
34139 1726867645.07089: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34139 1726867645.07091: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.07103: getting variables
34139 1726867645.07104: in VariableManager get_vars()
34139 1726867645.07130: Calling all_inventory to load vars for managed_node1
34139 1726867645.07132: Calling groups_inventory to load vars for managed_node1
34139 1726867645.07133: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.07139: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.07140: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.07142: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.07249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.07375: done with get_vars()
34139 1726867645.07384: done getting variables
34139 1726867645.07421: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024  17:27:25 -0400 (0:00:00.013)       0:00:03.820 ******
34139 1726867645.07442: entering _queue_task() for managed_node1/fail
34139 1726867645.07605: worker is 1 (out of 1 available)
34139 1726867645.07618: exiting _queue_task() for managed_node1/fail
34139 1726867645.07629: done queuing things up, now waiting for results queue to drain
34139 1726867645.07630: waiting for pending results...
34139 1726867645.07780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34139 1726867645.07852: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b3
34139 1726867645.07865: variable 'ansible_search_path' from source: unknown
34139 1726867645.07869: variable 'ansible_search_path' from source: unknown
34139 1726867645.07892: calling self._execute()
34139 1726867645.07948: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.07952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.07960: variable 'omit' from source: magic vars
34139 1726867645.08205: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.08217: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.08292: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.08297: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.08300: when evaluation is False, skipping this task
34139 1726867645.08303: _execute() done
34139 1726867645.08305: dumping result to json
34139 1726867645.08308: done dumping result, returning
34139 1726867645.08320: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c103-b8fd-0000000000b3]
34139 1726867645.08323: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b3
34139 1726867645.08401: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b3
34139 1726867645.08404: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.08461: no more pending results, returning what we have
34139 1726867645.08464: results queue empty
34139 1726867645.08465: checking for any_errors_fatal
34139 1726867645.08469: done checking for any_errors_fatal
34139 1726867645.08470: checking for max_fail_percentage
34139 1726867645.08471: done checking for max_fail_percentage
34139 1726867645.08472: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.08472: done checking to see if all hosts have failed
34139 1726867645.08473: getting the remaining hosts for this loop
34139 1726867645.08474: done getting the remaining hosts for this loop
34139 1726867645.08479: getting the next task for host managed_node1
34139 1726867645.08483: done getting next task for host managed_node1
34139 1726867645.08486: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34139 1726867645.08489: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.08503: getting variables
34139 1726867645.08504: in VariableManager get_vars()
34139 1726867645.08539: Calling all_inventory to load vars for managed_node1
34139 1726867645.08541: Calling groups_inventory to load vars for managed_node1
34139 1726867645.08542: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.08548: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.08550: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.08551: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.08692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.08816: done with get_vars()
34139 1726867645.08823: done getting variables
34139 1726867645.08862: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024  17:27:25 -0400 (0:00:00.014)       0:00:03.834 ******
34139 1726867645.08883: entering _queue_task() for managed_node1/fail
34139 1726867645.09047: worker is 1 (out of 1 available)
34139 1726867645.09059: exiting _queue_task() for managed_node1/fail
34139 1726867645.09070: done queuing things up, now waiting for results queue to drain
34139 1726867645.09072: waiting for pending results...
34139 1726867645.09232: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34139 1726867645.09306: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b4
34139 1726867645.09315: variable 'ansible_search_path' from source: unknown
34139 1726867645.09319: variable 'ansible_search_path' from source: unknown
34139 1726867645.09346: calling self._execute()
34139 1726867645.09416: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.09419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.09422: variable 'omit' from source: magic vars
34139 1726867645.09669: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.09680: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.09756: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.09760: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.09763: when evaluation is False, skipping this task
34139 1726867645.09766: _execute() done
34139 1726867645.09769: dumping result to json
34139 1726867645.09772: done dumping result, returning
34139 1726867645.09781: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c103-b8fd-0000000000b4]
34139 1726867645.09784: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b4
34139 1726867645.09866: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b4
34139 1726867645.09869: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.09920: no more pending results, returning what we have
34139 1726867645.09923: results queue empty
34139 1726867645.09924: checking for any_errors_fatal
34139 1726867645.09928: done checking for any_errors_fatal
34139 1726867645.09929: checking for max_fail_percentage
34139 1726867645.09930: done checking for max_fail_percentage
34139 1726867645.09931: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.09932: done checking to see if all hosts have failed
34139 1726867645.09932: getting the remaining hosts for this loop
34139 1726867645.09933: done getting the remaining hosts for this loop
34139 1726867645.09936: getting the next task for host managed_node1
34139 1726867645.09941: done getting next task for host managed_node1
34139 1726867645.09944: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34139 1726867645.09947: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867645.09961: getting variables
34139 1726867645.09962: in VariableManager get_vars()
34139 1726867645.09999: Calling all_inventory to load vars for managed_node1
34139 1726867645.10000: Calling groups_inventory to load vars for managed_node1
34139 1726867645.10002: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.10010: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.10012: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.10013: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.10124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.10251: done with get_vars()
34139 1726867645.10258: done getting variables
34139 1726867645.10296: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  17:27:25 -0400 (0:00:00.014)       0:00:03.849 ******
34139 1726867645.10319: entering _queue_task() for managed_node1/dnf
34139 1726867645.10484: worker is 1 (out of 1 available)
34139 1726867645.10496: exiting _queue_task() for managed_node1/dnf
34139 1726867645.10506: done queuing things up, now waiting for results queue to drain
34139 1726867645.10511: waiting for pending results...
34139 1726867645.10653: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867645.10722: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b5 34139 1726867645.10733: variable 'ansible_search_path' from source: unknown 34139 1726867645.10736: variable 'ansible_search_path' from source: unknown 34139 1726867645.10762: calling self._execute() 34139 1726867645.10817: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.10821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.10830: variable 'omit' from source: magic vars 34139 1726867645.11136: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.11144: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.11226: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.11229: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.11232: when evaluation is False, skipping this task 34139 1726867645.11236: _execute() done 34139 1726867645.11238: dumping result to json 34139 1726867645.11242: done dumping result, returning 34139 1726867645.11249: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-0000000000b5] 34139 1726867645.11254: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b5 34139 1726867645.11341: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b5 34139 1726867645.11343: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34139 1726867645.11390: no more pending results, returning what we have 34139 1726867645.11392: results queue empty 34139 1726867645.11393: checking for any_errors_fatal 34139 1726867645.11397: done checking for any_errors_fatal 34139 1726867645.11398: checking for max_fail_percentage 34139 1726867645.11399: done checking for max_fail_percentage 34139 1726867645.11400: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.11401: done checking to see if all hosts have failed 34139 1726867645.11401: getting the remaining hosts for this loop 34139 1726867645.11403: done getting the remaining hosts for this loop 34139 1726867645.11405: getting the next task for host managed_node1 34139 1726867645.11410: done getting next task for host managed_node1 34139 1726867645.11413: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867645.11415: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.11430: getting variables 34139 1726867645.11431: in VariableManager get_vars() 34139 1726867645.11465: Calling all_inventory to load vars for managed_node1 34139 1726867645.11467: Calling groups_inventory to load vars for managed_node1 34139 1726867645.11470: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.11476: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.11480: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.11482: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.11616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.11741: done with get_vars() 34139 1726867645.11748: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34139 1726867645.11801: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.864 ****** 34139 1726867645.11820: entering _queue_task() for managed_node1/yum 34139 1726867645.11986: worker is 1 (out of 1 available) 34139 1726867645.11997: exiting _queue_task() for managed_node1/yum 34139 1726867645.12007: done queuing things up, now waiting for results queue to drain 34139 1726867645.12009: waiting for pending results... 
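
The line `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` reflects that on this ansible-core 2.17 controller the `yum` action is a redirect to the `dnf` action plugin, so a task written with either module name loads the same plugin file (`plugins/action/dnf.py`). A hedged sketch of a task that would trigger this redirect — the package name and options are illustrative, not the role's actual task at `main.yml:48`:

```yaml
# Either spelling resolves to the dnf action plugin on this host;
# package name and check_mode usage are illustrative only.
- name: Check if updates for network packages are available
  ansible.builtin.yum:          # redirected to ansible.builtin.dnf at load time
    name: NetworkManager
    state: latest
  check_mode: true              # report available updates without installing
```
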
34139 1726867645.12162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867645.12243: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b6 34139 1726867645.12247: variable 'ansible_search_path' from source: unknown 34139 1726867645.12250: variable 'ansible_search_path' from source: unknown 34139 1726867645.12276: calling self._execute() 34139 1726867645.12335: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.12340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.12352: variable 'omit' from source: magic vars 34139 1726867645.12593: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.12602: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.12682: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.12686: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.12689: when evaluation is False, skipping this task 34139 1726867645.12691: _execute() done 34139 1726867645.12693: dumping result to json 34139 1726867645.12697: done dumping result, returning 34139 1726867645.12704: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-0000000000b6] 34139 1726867645.12708: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b6 34139 1726867645.12793: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b6 34139 1726867645.12796: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34139 1726867645.12843: no more pending results, returning what we have 34139 1726867645.12846: results queue empty 34139 1726867645.12847: checking for any_errors_fatal 34139 1726867645.12851: done checking for any_errors_fatal 34139 1726867645.12853: checking for max_fail_percentage 34139 1726867645.12854: done checking for max_fail_percentage 34139 1726867645.12855: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.12856: done checking to see if all hosts have failed 34139 1726867645.12856: getting the remaining hosts for this loop 34139 1726867645.12858: done getting the remaining hosts for this loop 34139 1726867645.12860: getting the next task for host managed_node1 34139 1726867645.12865: done getting next task for host managed_node1 34139 1726867645.12868: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867645.12871: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.12887: getting variables 34139 1726867645.12889: in VariableManager get_vars() 34139 1726867645.12924: Calling all_inventory to load vars for managed_node1 34139 1726867645.12927: Calling groups_inventory to load vars for managed_node1 34139 1726867645.12929: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.12935: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.12937: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.12939: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.13049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.13196: done with get_vars() 34139 1726867645.13203: done getting variables 34139 1726867645.13239: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:03.878 ****** 34139 1726867645.13261: entering _queue_task() for managed_node1/fail 34139 1726867645.13435: worker is 1 (out of 1 available) 34139 1726867645.13447: exiting _queue_task() for managed_node1/fail 34139 1726867645.13459: done queuing things up, now waiting for results queue to drain 34139 1726867645.13461: waiting for pending results... 
34139 1726867645.13617: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867645.13696: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b7 34139 1726867645.13703: variable 'ansible_search_path' from source: unknown 34139 1726867645.13706: variable 'ansible_search_path' from source: unknown 34139 1726867645.13735: calling self._execute() 34139 1726867645.13792: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.13797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.13807: variable 'omit' from source: magic vars 34139 1726867645.14053: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.14062: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.14142: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.14146: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.14149: when evaluation is False, skipping this task 34139 1726867645.14153: _execute() done 34139 1726867645.14156: dumping result to json 34139 1726867645.14160: done dumping result, returning 34139 1726867645.14167: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-0000000000b7] 34139 1726867645.14172: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b7 34139 1726867645.14257: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b7 34139 1726867645.14260: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.14310: no more pending results, returning what we have 
34139 1726867645.14313: results queue empty 34139 1726867645.14314: checking for any_errors_fatal 34139 1726867645.14318: done checking for any_errors_fatal 34139 1726867645.14318: checking for max_fail_percentage 34139 1726867645.14320: done checking for max_fail_percentage 34139 1726867645.14321: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.14321: done checking to see if all hosts have failed 34139 1726867645.14322: getting the remaining hosts for this loop 34139 1726867645.14323: done getting the remaining hosts for this loop 34139 1726867645.14326: getting the next task for host managed_node1 34139 1726867645.14330: done getting next task for host managed_node1 34139 1726867645.14334: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34139 1726867645.14336: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.14351: getting variables 34139 1726867645.14352: in VariableManager get_vars() 34139 1726867645.14393: Calling all_inventory to load vars for managed_node1 34139 1726867645.14395: Calling groups_inventory to load vars for managed_node1 34139 1726867645.14397: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.14402: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.14404: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.14406: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.14521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.14648: done with get_vars() 34139 1726867645.14656: done getting variables 34139 1726867645.14695: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:03.893 ****** 34139 1726867645.14718: entering _queue_task() for managed_node1/package 34139 1726867645.14891: worker is 1 (out of 1 available) 34139 1726867645.14903: exiting _queue_task() for managed_node1/package 34139 1726867645.14915: done queuing things up, now waiting for results queue to drain 34139 1726867645.14916: waiting for pending results... 
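
The "Install packages" task loads the generic `package` action plugin, which dispatches to the platform's native package manager (dnf on this host) at runtime. A minimal, hypothetical equivalent of such a task — the variable name and package list are assumptions for illustration, not the real task at `roles/network/tasks/main.yml:73`:

```yaml
# Illustrative sketch; the real role task takes its package list
# from role variables, and "network_packages" here is hypothetical.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages | default(['NetworkManager']) }}"
    state: present
```

Using `package` rather than `dnf` or `yum` directly keeps a role portable across distributions; the log shows this run skipping the task anyway because the same `== '7'` conditional evaluates False.
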
34139 1726867645.15069: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34139 1726867645.15148: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b8 34139 1726867645.15183: variable 'ansible_search_path' from source: unknown 34139 1726867645.15186: variable 'ansible_search_path' from source: unknown 34139 1726867645.15190: calling self._execute() 34139 1726867645.15250: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.15254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.15259: variable 'omit' from source: magic vars 34139 1726867645.15504: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.15516: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.15590: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.15594: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.15596: when evaluation is False, skipping this task 34139 1726867645.15599: _execute() done 34139 1726867645.15601: dumping result to json 34139 1726867645.15604: done dumping result, returning 34139 1726867645.15614: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c103-b8fd-0000000000b8] 34139 1726867645.15619: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b8 34139 1726867645.15704: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b8 34139 1726867645.15707: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.15762: no more pending results, returning what we have 34139 1726867645.15765: results queue empty 34139 1726867645.15765: checking for any_errors_fatal 34139 1726867645.15771: done 
checking for any_errors_fatal 34139 1726867645.15771: checking for max_fail_percentage 34139 1726867645.15773: done checking for max_fail_percentage 34139 1726867645.15773: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.15774: done checking to see if all hosts have failed 34139 1726867645.15775: getting the remaining hosts for this loop 34139 1726867645.15776: done getting the remaining hosts for this loop 34139 1726867645.15781: getting the next task for host managed_node1 34139 1726867645.15786: done getting next task for host managed_node1 34139 1726867645.15788: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867645.15791: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.15806: getting variables 34139 1726867645.15807: in VariableManager get_vars() 34139 1726867645.15840: Calling all_inventory to load vars for managed_node1 34139 1726867645.15842: Calling groups_inventory to load vars for managed_node1 34139 1726867645.15843: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.15849: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.15850: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.15852: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.15989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.16116: done with get_vars() 34139 1726867645.16123: done getting variables 34139 1726867645.16163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:03.907 ****** 34139 1726867645.16186: entering _queue_task() for managed_node1/package 34139 1726867645.16365: worker is 1 (out of 1 available) 34139 1726867645.16379: exiting _queue_task() for managed_node1/package 34139 1726867645.16390: done queuing things up, now waiting for results queue to drain 34139 1726867645.16392: waiting for pending results... 
34139 1726867645.16560: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867645.16642: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000b9 34139 1726867645.16652: variable 'ansible_search_path' from source: unknown 34139 1726867645.16656: variable 'ansible_search_path' from source: unknown 34139 1726867645.16683: calling self._execute() 34139 1726867645.16746: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.16749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.16759: variable 'omit' from source: magic vars 34139 1726867645.17025: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.17034: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.17118: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.17122: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.17125: when evaluation is False, skipping this task 34139 1726867645.17128: _execute() done 34139 1726867645.17130: dumping result to json 34139 1726867645.17135: done dumping result, returning 34139 1726867645.17149: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-0000000000b9] 34139 1726867645.17153: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b9 34139 1726867645.17236: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000b9 34139 1726867645.17239: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.17304: no more pending results, returning what we have 34139 1726867645.17307: 
results queue empty 34139 1726867645.17307: checking for any_errors_fatal 34139 1726867645.17312: done checking for any_errors_fatal 34139 1726867645.17312: checking for max_fail_percentage 34139 1726867645.17314: done checking for max_fail_percentage 34139 1726867645.17315: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.17315: done checking to see if all hosts have failed 34139 1726867645.17316: getting the remaining hosts for this loop 34139 1726867645.17317: done getting the remaining hosts for this loop 34139 1726867645.17320: getting the next task for host managed_node1 34139 1726867645.17325: done getting next task for host managed_node1 34139 1726867645.17328: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867645.17330: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.17346: getting variables 34139 1726867645.17347: in VariableManager get_vars() 34139 1726867645.17384: Calling all_inventory to load vars for managed_node1 34139 1726867645.17386: Calling groups_inventory to load vars for managed_node1 34139 1726867645.17388: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.17393: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.17395: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.17397: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.17511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.17637: done with get_vars() 34139 1726867645.17644: done getting variables 34139 1726867645.17685: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.923 ****** 34139 1726867645.17708: entering _queue_task() for managed_node1/package 34139 1726867645.17890: worker is 1 (out of 1 available) 34139 1726867645.17905: exiting _queue_task() for managed_node1/package 34139 1726867645.17915: done queuing things up, now waiting for results queue to drain 34139 1726867645.17917: waiting for pending results... 
34139 1726867645.18080: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867645.18159: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000ba 34139 1726867645.18171: variable 'ansible_search_path' from source: unknown 34139 1726867645.18174: variable 'ansible_search_path' from source: unknown 34139 1726867645.18204: calling self._execute() 34139 1726867645.18265: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.18269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.18280: variable 'omit' from source: magic vars 34139 1726867645.18533: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.18542: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.18621: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.18625: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.18628: when evaluation is False, skipping this task 34139 1726867645.18631: _execute() done 34139 1726867645.18634: dumping result to json 34139 1726867645.18638: done dumping result, returning 34139 1726867645.18646: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-0000000000ba] 34139 1726867645.18650: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000ba 34139 1726867645.18736: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000ba 34139 1726867645.18739: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.18787: no more pending results, returning what we have 34139 1726867645.18790: results queue 
empty 34139 1726867645.18791: checking for any_errors_fatal 34139 1726867645.18796: done checking for any_errors_fatal 34139 1726867645.18797: checking for max_fail_percentage 34139 1726867645.18798: done checking for max_fail_percentage 34139 1726867645.18799: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.18800: done checking to see if all hosts have failed 34139 1726867645.18801: getting the remaining hosts for this loop 34139 1726867645.18802: done getting the remaining hosts for this loop 34139 1726867645.18805: getting the next task for host managed_node1 34139 1726867645.18809: done getting next task for host managed_node1 34139 1726867645.18812: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867645.18815: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.18831: getting variables 34139 1726867645.18833: in VariableManager get_vars() 34139 1726867645.18870: Calling all_inventory to load vars for managed_node1 34139 1726867645.18873: Calling groups_inventory to load vars for managed_node1 34139 1726867645.18875: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.18885: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.18887: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.18888: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.19029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.19161: done with get_vars() 34139 1726867645.19168: done getting variables 34139 1726867645.19215: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.938 ****** 34139 1726867645.19237: entering _queue_task() for managed_node1/service 34139 1726867645.19424: worker is 1 (out of 1 available) 34139 1726867645.19439: exiting _queue_task() for managed_node1/service 34139 1726867645.19450: done queuing things up, now waiting for results queue to drain 34139 1726867645.19452: waiting for pending results... 
34139 1726867645.19624: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867645.19701: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000bb 34139 1726867645.19784: variable 'ansible_search_path' from source: unknown 34139 1726867645.19790: variable 'ansible_search_path' from source: unknown 34139 1726867645.19792: calling self._execute() 34139 1726867645.19815: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.19819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.19828: variable 'omit' from source: magic vars 34139 1726867645.20089: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.20099: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.20176: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.20182: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.20185: when evaluation is False, skipping this task 34139 1726867645.20187: _execute() done 34139 1726867645.20191: dumping result to json 34139 1726867645.20195: done dumping result, returning 34139 1726867645.20202: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-0000000000bb] 34139 1726867645.20207: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bb skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.20342: no more pending results, returning what we have 34139 1726867645.20345: results queue empty 34139 1726867645.20346: checking for any_errors_fatal 34139 1726867645.20351: done checking for any_errors_fatal 34139 1726867645.20352: 
checking for max_fail_percentage 34139 1726867645.20353: done checking for max_fail_percentage 34139 1726867645.20354: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.20354: done checking to see if all hosts have failed 34139 1726867645.20355: getting the remaining hosts for this loop 34139 1726867645.20356: done getting the remaining hosts for this loop 34139 1726867645.20359: getting the next task for host managed_node1 34139 1726867645.20364: done getting next task for host managed_node1 34139 1726867645.20367: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867645.20370: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.20389: getting variables 34139 1726867645.20390: in VariableManager get_vars() 34139 1726867645.20428: Calling all_inventory to load vars for managed_node1 34139 1726867645.20431: Calling groups_inventory to load vars for managed_node1 34139 1726867645.20433: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.20441: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.20443: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.20446: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.20559: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bb 34139 1726867645.20563: WORKER PROCESS EXITING 34139 1726867645.20572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.20701: done with get_vars() 34139 1726867645.20710: done getting variables 34139 1726867645.20748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.953 ****** 34139 1726867645.20770: entering _queue_task() for managed_node1/service 34139 1726867645.20954: worker is 1 (out of 1 available) 34139 1726867645.20967: exiting _queue_task() for managed_node1/service 34139 1726867645.20981: done queuing things up, now waiting for results queue to drain 34139 1726867645.20983: waiting for pending results... 
34139 1726867645.21141: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867645.21217: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000bc 34139 1726867645.21228: variable 'ansible_search_path' from source: unknown 34139 1726867645.21231: variable 'ansible_search_path' from source: unknown 34139 1726867645.21260: calling self._execute() 34139 1726867645.21324: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.21327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.21336: variable 'omit' from source: magic vars 34139 1726867645.21646: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.21656: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.21729: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.21733: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.21736: when evaluation is False, skipping this task 34139 1726867645.21739: _execute() done 34139 1726867645.21742: dumping result to json 34139 1726867645.21744: done dumping result, returning 34139 1726867645.21753: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c103-b8fd-0000000000bc] 34139 1726867645.21757: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bc 34139 1726867645.21843: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bc 34139 1726867645.21846: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34139 1726867645.21897: no more pending results, returning what we have 34139 1726867645.21899: results queue empty 34139 1726867645.21900: checking for any_errors_fatal 
34139 1726867645.21906: done checking for any_errors_fatal 34139 1726867645.21909: checking for max_fail_percentage 34139 1726867645.21911: done checking for max_fail_percentage 34139 1726867645.21911: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.21912: done checking to see if all hosts have failed 34139 1726867645.21913: getting the remaining hosts for this loop 34139 1726867645.21914: done getting the remaining hosts for this loop 34139 1726867645.21917: getting the next task for host managed_node1 34139 1726867645.21923: done getting next task for host managed_node1 34139 1726867645.21926: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34139 1726867645.21928: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.21945: getting variables 34139 1726867645.21946: in VariableManager get_vars() 34139 1726867645.21984: Calling all_inventory to load vars for managed_node1 34139 1726867645.21987: Calling groups_inventory to load vars for managed_node1 34139 1726867645.21989: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.21997: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.21999: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.22001: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.22146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.22274: done with get_vars() 34139 1726867645.22283: done getting variables 34139 1726867645.22329: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.969 ****** 34139 1726867645.22350: entering _queue_task() for managed_node1/service 34139 1726867645.22546: worker is 1 (out of 1 available) 34139 1726867645.22561: exiting _queue_task() for managed_node1/service 34139 1726867645.22572: done queuing things up, now waiting for results queue to drain 34139 1726867645.22574: waiting for pending results... 
34139 1726867645.22735: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34139 1726867645.22882: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000bd 34139 1726867645.22885: variable 'ansible_search_path' from source: unknown 34139 1726867645.22887: variable 'ansible_search_path' from source: unknown 34139 1726867645.22889: calling self._execute() 34139 1726867645.22927: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.22932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.22942: variable 'omit' from source: magic vars 34139 1726867645.23198: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.23210: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.23287: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.23291: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.23293: when evaluation is False, skipping this task 34139 1726867645.23298: _execute() done 34139 1726867645.23301: dumping result to json 34139 1726867645.23304: done dumping result, returning 34139 1726867645.23313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c103-b8fd-0000000000bd] 34139 1726867645.23315: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bd 34139 1726867645.23404: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bd 34139 1726867645.23409: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.23457: no more pending results, returning what we have 34139 1726867645.23460: results queue empty 34139 1726867645.23461: checking for any_errors_fatal 
34139 1726867645.23466: done checking for any_errors_fatal 34139 1726867645.23467: checking for max_fail_percentage 34139 1726867645.23468: done checking for max_fail_percentage 34139 1726867645.23469: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.23469: done checking to see if all hosts have failed 34139 1726867645.23470: getting the remaining hosts for this loop 34139 1726867645.23471: done getting the remaining hosts for this loop 34139 1726867645.23474: getting the next task for host managed_node1 34139 1726867645.23481: done getting next task for host managed_node1 34139 1726867645.23484: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34139 1726867645.23486: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.23502: getting variables 34139 1726867645.23503: in VariableManager get_vars() 34139 1726867645.23542: Calling all_inventory to load vars for managed_node1 34139 1726867645.23545: Calling groups_inventory to load vars for managed_node1 34139 1726867645.23546: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.23553: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.23554: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.23556: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.23672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.23824: done with get_vars() 34139 1726867645.23831: done getting variables 34139 1726867645.23872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:03.985 ****** 34139 1726867645.23894: entering _queue_task() for managed_node1/service 34139 1726867645.24083: worker is 1 (out of 1 available) 34139 1726867645.24097: exiting _queue_task() for managed_node1/service 34139 1726867645.24110: done queuing things up, now waiting for results queue to drain 34139 1726867645.24112: waiting for pending results... 
34139 1726867645.24270: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34139 1726867645.24348: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000be 34139 1726867645.24358: variable 'ansible_search_path' from source: unknown 34139 1726867645.24362: variable 'ansible_search_path' from source: unknown 34139 1726867645.24391: calling self._execute() 34139 1726867645.24448: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.24459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.24468: variable 'omit' from source: magic vars 34139 1726867645.24737: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.24746: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.24827: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.24831: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.24834: when evaluation is False, skipping this task 34139 1726867645.24836: _execute() done 34139 1726867645.24839: dumping result to json 34139 1726867645.24841: done dumping result, returning 34139 1726867645.24849: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c103-b8fd-0000000000be] 34139 1726867645.24853: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000be 34139 1726867645.24932: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000be 34139 1726867645.24936: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34139 1726867645.24975: no more pending results, returning what we have 34139 1726867645.24980: results queue empty 34139 1726867645.24981: checking for any_errors_fatal 34139 
1726867645.24988: done checking for any_errors_fatal 34139 1726867645.24989: checking for max_fail_percentage 34139 1726867645.24990: done checking for max_fail_percentage 34139 1726867645.24991: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.24992: done checking to see if all hosts have failed 34139 1726867645.24993: getting the remaining hosts for this loop 34139 1726867645.24994: done getting the remaining hosts for this loop 34139 1726867645.24997: getting the next task for host managed_node1 34139 1726867645.25003: done getting next task for host managed_node1 34139 1726867645.25006: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34139 1726867645.25009: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.25025: getting variables 34139 1726867645.25026: in VariableManager get_vars() 34139 1726867645.25061: Calling all_inventory to load vars for managed_node1 34139 1726867645.25064: Calling groups_inventory to load vars for managed_node1 34139 1726867645.25066: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.25073: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.25076: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.25080: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.25193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.25322: done with get_vars() 34139 1726867645.25330: done getting variables 34139 1726867645.25368: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:03.999 ****** 34139 1726867645.25391: entering _queue_task() for managed_node1/copy 34139 1726867645.25570: worker is 1 (out of 1 available) 34139 1726867645.25586: exiting _queue_task() for managed_node1/copy 34139 1726867645.25599: done queuing things up, now waiting for results queue to drain 34139 1726867645.25601: waiting for pending results... 
34139 1726867645.25763: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34139 1726867645.25839: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000bf 34139 1726867645.25856: variable 'ansible_search_path' from source: unknown 34139 1726867645.25859: variable 'ansible_search_path' from source: unknown 34139 1726867645.25883: calling self._execute() 34139 1726867645.25946: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.25949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.25961: variable 'omit' from source: magic vars 34139 1726867645.26218: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.26227: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.26303: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.26307: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.26314: when evaluation is False, skipping this task 34139 1726867645.26317: _execute() done 34139 1726867645.26320: dumping result to json 34139 1726867645.26323: done dumping result, returning 34139 1726867645.26331: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c103-b8fd-0000000000bf] 34139 1726867645.26335: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bf 34139 1726867645.26420: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000bf 34139 1726867645.26423: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.26467: no more pending results, returning what we have 34139 1726867645.26469: results queue empty 34139 
1726867645.26470: checking for any_errors_fatal 34139 1726867645.26474: done checking for any_errors_fatal 34139 1726867645.26475: checking for max_fail_percentage 34139 1726867645.26476: done checking for max_fail_percentage 34139 1726867645.26479: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.26479: done checking to see if all hosts have failed 34139 1726867645.26480: getting the remaining hosts for this loop 34139 1726867645.26481: done getting the remaining hosts for this loop 34139 1726867645.26484: getting the next task for host managed_node1 34139 1726867645.26489: done getting next task for host managed_node1 34139 1726867645.26492: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867645.26494: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.26510: getting variables 34139 1726867645.26511: in VariableManager get_vars() 34139 1726867645.26547: Calling all_inventory to load vars for managed_node1 34139 1726867645.26549: Calling groups_inventory to load vars for managed_node1 34139 1726867645.26552: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.26558: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.26560: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.26561: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.26855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.26975: done with get_vars() 34139 1726867645.26983: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:27:25 -0400 (0:00:00.016) 0:00:04.016 ****** 34139 1726867645.27037: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867645.27208: worker is 1 (out of 1 available) 34139 1726867645.27221: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34139 1726867645.27233: done queuing things up, now waiting for results queue to drain 34139 1726867645.27234: waiting for pending results... 
34139 1726867645.27395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34139 1726867645.27474: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c0 34139 1726867645.27486: variable 'ansible_search_path' from source: unknown 34139 1726867645.27489: variable 'ansible_search_path' from source: unknown 34139 1726867645.27518: calling self._execute() 34139 1726867645.27573: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.27583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.27592: variable 'omit' from source: magic vars 34139 1726867645.27855: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.27864: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.27945: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.27949: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.27951: when evaluation is False, skipping this task 34139 1726867645.27954: _execute() done 34139 1726867645.27956: dumping result to json 34139 1726867645.27960: done dumping result, returning 34139 1726867645.27968: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c103-b8fd-0000000000c0] 34139 1726867645.27971: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c0 34139 1726867645.28062: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c0 34139 1726867645.28065: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.28112: no more pending results, returning what we have 34139 1726867645.28115: results queue empty 34139 1726867645.28116: checking 
for any_errors_fatal 34139 1726867645.28120: done checking for any_errors_fatal 34139 1726867645.28121: checking for max_fail_percentage 34139 1726867645.28123: done checking for max_fail_percentage 34139 1726867645.28124: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.28125: done checking to see if all hosts have failed 34139 1726867645.28125: getting the remaining hosts for this loop 34139 1726867645.28126: done getting the remaining hosts for this loop 34139 1726867645.28129: getting the next task for host managed_node1 34139 1726867645.28133: done getting next task for host managed_node1 34139 1726867645.28136: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867645.28138: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.28153: getting variables 34139 1726867645.28155: in VariableManager get_vars() 34139 1726867645.28204: Calling all_inventory to load vars for managed_node1 34139 1726867645.28206: Calling groups_inventory to load vars for managed_node1 34139 1726867645.28208: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.28214: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.28216: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.28217: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.28330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.28456: done with get_vars() 34139 1726867645.28463: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:04.031 ****** 34139 1726867645.28521: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867645.28694: worker is 1 (out of 1 available) 34139 1726867645.28707: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34139 1726867645.28718: done queuing things up, now waiting for results queue to drain 34139 1726867645.28720: waiting for pending results... 
34139 1726867645.28882: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34139 1726867645.28961: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c1 34139 1726867645.28973: variable 'ansible_search_path' from source: unknown 34139 1726867645.28976: variable 'ansible_search_path' from source: unknown 34139 1726867645.29003: calling self._execute() 34139 1726867645.29064: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.29067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.29076: variable 'omit' from source: magic vars 34139 1726867645.29341: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.29350: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.29430: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.29434: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.29437: when evaluation is False, skipping this task 34139 1726867645.29439: _execute() done 34139 1726867645.29442: dumping result to json 34139 1726867645.29447: done dumping result, returning 34139 1726867645.29453: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c103-b8fd-0000000000c1] 34139 1726867645.29457: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c1 34139 1726867645.29540: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c1 34139 1726867645.29543: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.29596: no more pending results, returning what we have 34139 1726867645.29599: results queue empty 34139 1726867645.29600: checking for any_errors_fatal 34139 
1726867645.29605: done checking for any_errors_fatal 34139 1726867645.29606: checking for max_fail_percentage 34139 1726867645.29607: done checking for max_fail_percentage 34139 1726867645.29608: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.29608: done checking to see if all hosts have failed 34139 1726867645.29609: getting the remaining hosts for this loop 34139 1726867645.29610: done getting the remaining hosts for this loop 34139 1726867645.29613: getting the next task for host managed_node1 34139 1726867645.29618: done getting next task for host managed_node1 34139 1726867645.29620: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34139 1726867645.29623: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.29638: getting variables 34139 1726867645.29639: in VariableManager get_vars() 34139 1726867645.29682: Calling all_inventory to load vars for managed_node1 34139 1726867645.29684: Calling groups_inventory to load vars for managed_node1 34139 1726867645.29686: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.29691: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.29693: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.29695: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.29840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.29965: done with get_vars() 34139 1726867645.29973: done getting variables 34139 1726867645.30014: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:04.046 ****** 34139 1726867645.30034: entering _queue_task() for managed_node1/debug 34139 1726867645.30207: worker is 1 (out of 1 available) 34139 1726867645.30219: exiting _queue_task() for managed_node1/debug 34139 1726867645.30231: done queuing things up, now waiting for results queue to drain 34139 1726867645.30232: waiting for pending results... 
34139 1726867645.30390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34139 1726867645.30466: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c2 34139 1726867645.30478: variable 'ansible_search_path' from source: unknown 34139 1726867645.30482: variable 'ansible_search_path' from source: unknown 34139 1726867645.30508: calling self._execute() 34139 1726867645.30571: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.30574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.30586: variable 'omit' from source: magic vars 34139 1726867645.30845: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.30855: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.30934: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.30938: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.30941: when evaluation is False, skipping this task 34139 1726867645.30944: _execute() done 34139 1726867645.30947: dumping result to json 34139 1726867645.30949: done dumping result, returning 34139 1726867645.30957: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c103-b8fd-0000000000c2] 34139 1726867645.30961: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c2 34139 1726867645.31042: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c2 34139 1726867645.31045: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867645.31088: no more pending results, returning what we have 34139 1726867645.31091: results queue empty 34139 1726867645.31092: checking for any_errors_fatal 34139 1726867645.31095: done 
checking for any_errors_fatal 34139 1726867645.31096: checking for max_fail_percentage 34139 1726867645.31097: done checking for max_fail_percentage 34139 1726867645.31098: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.31099: done checking to see if all hosts have failed 34139 1726867645.31099: getting the remaining hosts for this loop 34139 1726867645.31100: done getting the remaining hosts for this loop 34139 1726867645.31103: getting the next task for host managed_node1 34139 1726867645.31108: done getting next task for host managed_node1 34139 1726867645.31112: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34139 1726867645.31114: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.31129: getting variables 34139 1726867645.31130: in VariableManager get_vars() 34139 1726867645.31165: Calling all_inventory to load vars for managed_node1 34139 1726867645.31167: Calling groups_inventory to load vars for managed_node1 34139 1726867645.31169: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.31175: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.31179: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.31181: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.31291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.31419: done with get_vars() 34139 1726867645.31426: done getting variables 34139 1726867645.31462: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:27:25 -0400 (0:00:00.014) 0:00:04.060 ****** 34139 1726867645.31484: entering _queue_task() for managed_node1/debug 34139 1726867645.31653: worker is 1 (out of 1 available) 34139 1726867645.31668: exiting _queue_task() for managed_node1/debug 34139 1726867645.31680: done queuing things up, now waiting for results queue to drain 34139 1726867645.31682: waiting for pending results... 
34139 1726867645.31835: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34139 1726867645.31905: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c3 34139 1726867645.31917: variable 'ansible_search_path' from source: unknown 34139 1726867645.31922: variable 'ansible_search_path' from source: unknown 34139 1726867645.31949: calling self._execute() 34139 1726867645.32007: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.32015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.32024: variable 'omit' from source: magic vars 34139 1726867645.32325: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.32333: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.32414: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.32418: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.32420: when evaluation is False, skipping this task 34139 1726867645.32425: _execute() done 34139 1726867645.32427: dumping result to json 34139 1726867645.32430: done dumping result, returning 34139 1726867645.32437: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c103-b8fd-0000000000c3] 34139 1726867645.32441: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c3 34139 1726867645.32537: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c3 34139 1726867645.32540: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867645.32652: no more pending results, returning what we have 34139 1726867645.32657: results queue empty 34139 1726867645.32658: checking for any_errors_fatal 34139 1726867645.32666: done 
checking for any_errors_fatal 34139 1726867645.32667: checking for max_fail_percentage 34139 1726867645.32669: done checking for max_fail_percentage 34139 1726867645.32670: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.32670: done checking to see if all hosts have failed 34139 1726867645.32671: getting the remaining hosts for this loop 34139 1726867645.32672: done getting the remaining hosts for this loop 34139 1726867645.32708: getting the next task for host managed_node1 34139 1726867645.32716: done getting next task for host managed_node1 34139 1726867645.32721: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34139 1726867645.32725: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.32744: getting variables 34139 1726867645.32746: in VariableManager get_vars() 34139 1726867645.32787: Calling all_inventory to load vars for managed_node1 34139 1726867645.32789: Calling groups_inventory to load vars for managed_node1 34139 1726867645.32792: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.32800: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.32803: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.32806: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.32956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.33082: done with get_vars() 34139 1726867645.33089: done getting variables 34139 1726867645.33131: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:27:25 -0400 (0:00:00.016) 0:00:04.077 ****** 34139 1726867645.33150: entering _queue_task() for managed_node1/debug 34139 1726867645.33311: worker is 1 (out of 1 available) 34139 1726867645.33323: exiting _queue_task() for managed_node1/debug 34139 1726867645.33332: done queuing things up, now waiting for results queue to drain 34139 1726867645.33334: waiting for pending results... 
34139 1726867645.33499: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34139 1726867645.33574: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c4 34139 1726867645.33586: variable 'ansible_search_path' from source: unknown 34139 1726867645.33590: variable 'ansible_search_path' from source: unknown 34139 1726867645.33616: calling self._execute() 34139 1726867645.33680: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.33684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.33693: variable 'omit' from source: magic vars 34139 1726867645.33952: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.33961: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.34041: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.34045: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.34047: when evaluation is False, skipping this task 34139 1726867645.34050: _execute() done 34139 1726867645.34053: dumping result to json 34139 1726867645.34058: done dumping result, returning 34139 1726867645.34065: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c103-b8fd-0000000000c4] 34139 1726867645.34070: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c4 34139 1726867645.34152: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c4 34139 1726867645.34155: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867645.34196: no more pending results, returning what we have 34139 1726867645.34199: results queue empty 34139 1726867645.34200: checking for any_errors_fatal 34139 1726867645.34203: done checking for 
any_errors_fatal 34139 1726867645.34203: checking for max_fail_percentage 34139 1726867645.34204: done checking for max_fail_percentage 34139 1726867645.34205: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.34206: done checking to see if all hosts have failed 34139 1726867645.34209: getting the remaining hosts for this loop 34139 1726867645.34210: done getting the remaining hosts for this loop 34139 1726867645.34213: getting the next task for host managed_node1 34139 1726867645.34217: done getting next task for host managed_node1 34139 1726867645.34220: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34139 1726867645.34223: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.34237: getting variables 34139 1726867645.34238: in VariableManager get_vars() 34139 1726867645.34272: Calling all_inventory to load vars for managed_node1 34139 1726867645.34274: Calling groups_inventory to load vars for managed_node1 34139 1726867645.34276: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.34284: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.34286: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.34287: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.34397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.34565: done with get_vars() 34139 1726867645.34574: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:27:25 -0400 (0:00:00.015) 0:00:04.092 ****** 34139 1726867645.34658: entering _queue_task() for managed_node1/ping 34139 1726867645.34868: worker is 1 (out of 1 available) 34139 1726867645.34880: exiting _queue_task() for managed_node1/ping 34139 1726867645.34891: done queuing things up, now waiting for results queue to drain 34139 1726867645.34892: waiting for pending results... 
34139 1726867645.35295: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34139 1726867645.35299: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000c5 34139 1726867645.35302: variable 'ansible_search_path' from source: unknown 34139 1726867645.35304: variable 'ansible_search_path' from source: unknown 34139 1726867645.35306: calling self._execute() 34139 1726867645.35380: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.35395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.35411: variable 'omit' from source: magic vars 34139 1726867645.35768: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.35787: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.35906: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.35934: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.35937: when evaluation is False, skipping this task 34139 1726867645.35940: _execute() done 34139 1726867645.35942: dumping result to json 34139 1726867645.36082: done dumping result, returning 34139 1726867645.36086: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c103-b8fd-0000000000c5] 34139 1726867645.36089: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c5 34139 1726867645.36149: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000c5 34139 1726867645.36152: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.36193: no more pending results, returning what we have 34139 1726867645.36196: results queue empty 34139 1726867645.36197: checking for any_errors_fatal 34139 
1726867645.36202: done checking for any_errors_fatal 34139 1726867645.36203: checking for max_fail_percentage 34139 1726867645.36204: done checking for max_fail_percentage 34139 1726867645.36205: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.36206: done checking to see if all hosts have failed 34139 1726867645.36209: getting the remaining hosts for this loop 34139 1726867645.36210: done getting the remaining hosts for this loop 34139 1726867645.36214: getting the next task for host managed_node1 34139 1726867645.36221: done getting next task for host managed_node1 34139 1726867645.36223: ^ task is: TASK: meta (role_complete) 34139 1726867645.36225: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34139 1726867645.36243: getting variables 34139 1726867645.36245: in VariableManager get_vars() 34139 1726867645.36289: Calling all_inventory to load vars for managed_node1 34139 1726867645.36291: Calling groups_inventory to load vars for managed_node1 34139 1726867645.36294: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.36303: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.36306: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.36311: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.36579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.36794: done with get_vars() 34139 1726867645.36803: done getting variables 34139 1726867645.36876: done queuing things up, now waiting for results queue to drain 34139 1726867645.36879: results queue empty 34139 1726867645.36879: checking for any_errors_fatal 34139 1726867645.36881: done checking for any_errors_fatal 34139 1726867645.36882: checking for max_fail_percentage 34139 1726867645.36883: done checking for max_fail_percentage 34139 1726867645.36884: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.36884: done checking to see if all hosts have failed 34139 1726867645.36885: getting the remaining hosts for this loop 34139 1726867645.36886: done getting the remaining hosts for this loop 34139 1726867645.36888: getting the next task for host managed_node1 34139 1726867645.36894: done getting next task for host managed_node1 34139 1726867645.36897: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34139 1726867645.36899: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34139 1726867645.36911: getting variables 34139 1726867645.36912: in VariableManager get_vars() 34139 1726867645.36929: Calling all_inventory to load vars for managed_node1 34139 1726867645.36931: Calling groups_inventory to load vars for managed_node1 34139 1726867645.36933: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.36937: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.36940: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.36942: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.37118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.37331: done with get_vars() 34139 1726867645.37340: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:27:25 -0400 (0:00:00.027) 0:00:04.120 ****** 34139 1726867645.37413: entering _queue_task() for managed_node1/include_tasks 34139 1726867645.37635: worker is 1 (out of 1 available) 34139 1726867645.37646: exiting _queue_task() for managed_node1/include_tasks 34139 1726867645.37658: done queuing things up, now waiting for 
results queue to drain 34139 1726867645.37660: waiting for pending results... 34139 1726867645.38091: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34139 1726867645.38102: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000fd 34139 1726867645.38106: variable 'ansible_search_path' from source: unknown 34139 1726867645.38111: variable 'ansible_search_path' from source: unknown 34139 1726867645.38121: calling self._execute() 34139 1726867645.38190: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.38194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.38203: variable 'omit' from source: magic vars 34139 1726867645.38462: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.38471: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.38552: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.38556: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.38559: when evaluation is False, skipping this task 34139 1726867645.38563: _execute() done 34139 1726867645.38565: dumping result to json 34139 1726867645.38569: done dumping result, returning 34139 1726867645.38575: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c103-b8fd-0000000000fd] 34139 1726867645.38585: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000fd 34139 1726867645.38668: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000fd 34139 1726867645.38671: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.38719: no more pending results, returning what we have 34139 
1726867645.38722: results queue empty 34139 1726867645.38722: checking for any_errors_fatal 34139 1726867645.38724: done checking for any_errors_fatal 34139 1726867645.38724: checking for max_fail_percentage 34139 1726867645.38726: done checking for max_fail_percentage 34139 1726867645.38726: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.38727: done checking to see if all hosts have failed 34139 1726867645.38728: getting the remaining hosts for this loop 34139 1726867645.38729: done getting the remaining hosts for this loop 34139 1726867645.38732: getting the next task for host managed_node1 34139 1726867645.38737: done getting next task for host managed_node1 34139 1726867645.38740: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34139 1726867645.38744: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.38758: getting variables 34139 1726867645.38759: in VariableManager get_vars() 34139 1726867645.38798: Calling all_inventory to load vars for managed_node1 34139 1726867645.38800: Calling groups_inventory to load vars for managed_node1 34139 1726867645.38802: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.38809: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.38811: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.38813: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.38927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.39057: done with get_vars() 34139 1726867645.39063: done getting variables 34139 1726867645.39101: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:27:25 -0400 (0:00:00.017) 0:00:04.137 ****** 34139 1726867645.39126: entering _queue_task() for managed_node1/debug 34139 1726867645.39366: worker is 1 (out of 1 available) 34139 1726867645.39382: exiting _queue_task() for managed_node1/debug 34139 1726867645.39395: done queuing things up, now waiting for results queue to drain 34139 1726867645.39397: waiting for pending results... 
34139 1726867645.39640: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34139 1726867645.39728: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000fe 34139 1726867645.39739: variable 'ansible_search_path' from source: unknown 34139 1726867645.39742: variable 'ansible_search_path' from source: unknown 34139 1726867645.39770: calling self._execute() 34139 1726867645.39831: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.39835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.39844: variable 'omit' from source: magic vars 34139 1726867645.40154: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.40382: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.40386: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.40388: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.40391: when evaluation is False, skipping this task 34139 1726867645.40394: _execute() done 34139 1726867645.40397: dumping result to json 34139 1726867645.40399: done dumping result, returning 34139 1726867645.40406: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c103-b8fd-0000000000fe] 34139 1726867645.40411: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000fe 34139 1726867645.40476: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000fe 34139 1726867645.40482: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34139 1726867645.40553: no more pending results, returning what we have 34139 1726867645.40556: results queue empty 34139 1726867645.40557: checking for any_errors_fatal 34139 1726867645.40561: done checking for any_errors_fatal 34139 1726867645.40562: 
checking for max_fail_percentage 34139 1726867645.40563: done checking for max_fail_percentage 34139 1726867645.40564: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.40565: done checking to see if all hosts have failed 34139 1726867645.40566: getting the remaining hosts for this loop 34139 1726867645.40567: done getting the remaining hosts for this loop 34139 1726867645.40569: getting the next task for host managed_node1 34139 1726867645.40575: done getting next task for host managed_node1 34139 1726867645.40629: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34139 1726867645.40634: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.40649: getting variables 34139 1726867645.40650: in VariableManager get_vars() 34139 1726867645.40688: Calling all_inventory to load vars for managed_node1 34139 1726867645.40691: Calling groups_inventory to load vars for managed_node1 34139 1726867645.40693: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.40701: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.40703: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.40706: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.40926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.41149: done with get_vars() 34139 1726867645.41158: done getting variables 34139 1726867645.41212: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:27:25 -0400 (0:00:00.021) 0:00:04.158 ****** 34139 1726867645.41246: entering _queue_task() for managed_node1/fail 34139 1726867645.41464: worker is 1 (out of 1 available) 34139 1726867645.41476: exiting _queue_task() for managed_node1/fail 34139 1726867645.41489: done queuing things up, now waiting for results queue to drain 34139 1726867645.41491: waiting for pending results... 
34139 1726867645.41742: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34139 1726867645.41879: in run() - task 0affcac9-a3a5-c103-b8fd-0000000000ff 34139 1726867645.41899: variable 'ansible_search_path' from source: unknown 34139 1726867645.41910: variable 'ansible_search_path' from source: unknown 34139 1726867645.41949: calling self._execute() 34139 1726867645.42037: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.42049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.42064: variable 'omit' from source: magic vars 34139 1726867645.42423: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.42439: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.42556: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.42566: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.42573: when evaluation is False, skipping this task 34139 1726867645.42582: _execute() done 34139 1726867645.42588: dumping result to json 34139 1726867645.42595: done dumping result, returning 34139 1726867645.42605: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c103-b8fd-0000000000ff] 34139 1726867645.42682: sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000ff skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.42938: no more pending results, returning what we have 34139 1726867645.42941: results queue empty 34139 1726867645.42942: 
checking for any_errors_fatal 34139 1726867645.42947: done checking for any_errors_fatal 34139 1726867645.42948: checking for max_fail_percentage 34139 1726867645.42950: done checking for max_fail_percentage 34139 1726867645.42950: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.42951: done checking to see if all hosts have failed 34139 1726867645.42952: getting the remaining hosts for this loop 34139 1726867645.42953: done getting the remaining hosts for this loop 34139 1726867645.42956: getting the next task for host managed_node1 34139 1726867645.42961: done getting next task for host managed_node1 34139 1726867645.42964: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867645.42968: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.42986: getting variables 34139 1726867645.42988: in VariableManager get_vars() 34139 1726867645.43027: Calling all_inventory to load vars for managed_node1 34139 1726867645.43029: Calling groups_inventory to load vars for managed_node1 34139 1726867645.43032: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.43039: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.43042: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.43044: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.43210: done sending task result for task 0affcac9-a3a5-c103-b8fd-0000000000ff 34139 1726867645.43214: WORKER PROCESS EXITING 34139 1726867645.43234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.43464: done with get_vars() 34139 1726867645.43474: done getting variables 34139 1726867645.43530: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:27:25 -0400 (0:00:00.023) 0:00:04.181 ****** 34139 1726867645.43559: entering _queue_task() for managed_node1/fail 34139 1726867645.43767: worker is 1 (out of 1 available) 34139 1726867645.43781: exiting _queue_task() for managed_node1/fail 34139 1726867645.43791: done queuing things up, now waiting for results queue to drain 34139 1726867645.43793: waiting for pending results... 
34139 1726867645.44041: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34139 1726867645.44175: in run() - task 0affcac9-a3a5-c103-b8fd-000000000100 34139 1726867645.44198: variable 'ansible_search_path' from source: unknown 34139 1726867645.44209: variable 'ansible_search_path' from source: unknown 34139 1726867645.44246: calling self._execute() 34139 1726867645.44327: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.44338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.44351: variable 'omit' from source: magic vars 34139 1726867645.44693: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.44713: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.44832: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.44845: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.44853: when evaluation is False, skipping this task 34139 1726867645.44862: _execute() done 34139 1726867645.44868: dumping result to json 34139 1726867645.44875: done dumping result, returning 34139 1726867645.44888: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c103-b8fd-000000000100] 34139 1726867645.44948: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000100 34139 1726867645.45017: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000100 34139 1726867645.45020: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.45097: no more 
pending results, returning what we have 34139 1726867645.45100: results queue empty 34139 1726867645.45101: checking for any_errors_fatal 34139 1726867645.45110: done checking for any_errors_fatal 34139 1726867645.45111: checking for max_fail_percentage 34139 1726867645.45112: done checking for max_fail_percentage 34139 1726867645.45113: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.45114: done checking to see if all hosts have failed 34139 1726867645.45115: getting the remaining hosts for this loop 34139 1726867645.45116: done getting the remaining hosts for this loop 34139 1726867645.45119: getting the next task for host managed_node1 34139 1726867645.45127: done getting next task for host managed_node1 34139 1726867645.45130: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867645.45134: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.45152: getting variables 34139 1726867645.45154: in VariableManager get_vars() 34139 1726867645.45202: Calling all_inventory to load vars for managed_node1 34139 1726867645.45205: Calling groups_inventory to load vars for managed_node1 34139 1726867645.45210: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.45221: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.45224: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.45226: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.45584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.45811: done with get_vars() 34139 1726867645.45821: done getting variables 34139 1726867645.45871: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:27:25 -0400 (0:00:00.023) 0:00:04.205 ****** 34139 1726867645.45902: entering _queue_task() for managed_node1/fail 34139 1726867645.46106: worker is 1 (out of 1 available) 34139 1726867645.46120: exiting _queue_task() for managed_node1/fail 34139 1726867645.46131: done queuing things up, now waiting for results queue to drain 34139 1726867645.46132: waiting for pending results... 
34139 1726867645.46534: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34139 1726867645.46539: in run() - task 0affcac9-a3a5-c103-b8fd-000000000101 34139 1726867645.46541: variable 'ansible_search_path' from source: unknown 34139 1726867645.46544: variable 'ansible_search_path' from source: unknown 34139 1726867645.46673: calling self._execute() 34139 1726867645.46845: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.46857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.46889: variable 'omit' from source: magic vars 34139 1726867645.47725: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.47729: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.47802: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.47816: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.47823: when evaluation is False, skipping this task 34139 1726867645.47833: _execute() done 34139 1726867645.47863: dumping result to json 34139 1726867645.47871: done dumping result, returning 34139 1726867645.47887: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c103-b8fd-000000000101] 34139 1726867645.47919: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000101 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.48094: no more pending results, returning what we have 34139 1726867645.48098: results queue empty 34139 1726867645.48098: checking for any_errors_fatal 34139 
1726867645.48104: done checking for any_errors_fatal 34139 1726867645.48105: checking for max_fail_percentage 34139 1726867645.48109: done checking for max_fail_percentage 34139 1726867645.48110: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.48111: done checking to see if all hosts have failed 34139 1726867645.48112: getting the remaining hosts for this loop 34139 1726867645.48113: done getting the remaining hosts for this loop 34139 1726867645.48117: getting the next task for host managed_node1 34139 1726867645.48124: done getting next task for host managed_node1 34139 1726867645.48127: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867645.48132: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.48150: getting variables 34139 1726867645.48152: in VariableManager get_vars() 34139 1726867645.48197: Calling all_inventory to load vars for managed_node1 34139 1726867645.48200: Calling groups_inventory to load vars for managed_node1 34139 1726867645.48202: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.48216: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.48219: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.48223: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.48579: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000101 34139 1726867645.48583: WORKER PROCESS EXITING 34139 1726867645.48604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.48832: done with get_vars() 34139 1726867645.48842: done getting variables 34139 1726867645.48895: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:27:25 -0400 (0:00:00.030) 0:00:04.235 ****** 34139 1726867645.48926: entering _queue_task() for managed_node1/dnf 34139 1726867645.49141: worker is 1 (out of 1 available) 34139 1726867645.49154: exiting _queue_task() for managed_node1/dnf 34139 1726867645.49165: done queuing things up, now waiting for results queue to drain 34139 1726867645.49167: waiting for pending results... 
34139 1726867645.49430: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34139 1726867645.49557: in run() - task 0affcac9-a3a5-c103-b8fd-000000000102 34139 1726867645.49572: variable 'ansible_search_path' from source: unknown 34139 1726867645.49581: variable 'ansible_search_path' from source: unknown 34139 1726867645.49621: calling self._execute() 34139 1726867645.49695: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.49711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.49725: variable 'omit' from source: magic vars 34139 1726867645.50147: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.50165: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.50282: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.50292: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.50299: when evaluation is False, skipping this task 34139 1726867645.50306: _execute() done 34139 1726867645.50317: dumping result to json 34139 1726867645.50324: done dumping result, returning 34139 1726867645.50334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000102] 34139 1726867645.50343: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000102 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.50618: no more pending results, returning what we have 34139 1726867645.50621: results queue empty 34139 
1726867645.50622: checking for any_errors_fatal 34139 1726867645.50627: done checking for any_errors_fatal 34139 1726867645.50628: checking for max_fail_percentage 34139 1726867645.50629: done checking for max_fail_percentage 34139 1726867645.50630: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.50631: done checking to see if all hosts have failed 34139 1726867645.50632: getting the remaining hosts for this loop 34139 1726867645.50633: done getting the remaining hosts for this loop 34139 1726867645.50636: getting the next task for host managed_node1 34139 1726867645.50641: done getting next task for host managed_node1 34139 1726867645.50644: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867645.50648: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.50663: getting variables 34139 1726867645.50665: in VariableManager get_vars() 34139 1726867645.50704: Calling all_inventory to load vars for managed_node1 34139 1726867645.50709: Calling groups_inventory to load vars for managed_node1 34139 1726867645.50712: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.50720: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.50722: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.50725: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.50985: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000102 34139 1726867645.50988: WORKER PROCESS EXITING 34139 1726867645.51011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.51234: done with get_vars() 34139 1726867645.51243: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34139 1726867645.51315: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:27:25 -0400 (0:00:00.024) 0:00:04.259 ****** 34139 1726867645.51342: entering _queue_task() for managed_node1/yum 34139 1726867645.51549: worker is 1 (out of 1 available) 34139 1726867645.51560: exiting _queue_task() for managed_node1/yum 34139 1726867645.51571: done queuing things up, now 
waiting for results queue to drain 34139 1726867645.51572: waiting for pending results... 34139 1726867645.51826: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34139 1726867645.51957: in run() - task 0affcac9-a3a5-c103-b8fd-000000000103 34139 1726867645.51975: variable 'ansible_search_path' from source: unknown 34139 1726867645.51986: variable 'ansible_search_path' from source: unknown 34139 1726867645.52028: calling self._execute() 34139 1726867645.52109: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.52123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.52137: variable 'omit' from source: magic vars 34139 1726867645.52496: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.52516: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.52633: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.52643: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.52651: when evaluation is False, skipping this task 34139 1726867645.52662: _execute() done 34139 1726867645.52668: dumping result to json 34139 1726867645.52675: done dumping result, returning 34139 1726867645.52688: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000103] 34139 1726867645.52696: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000103 34139 1726867645.52883: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000103 34139 1726867645.52887: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.52941: no more pending results, returning what we have 34139 1726867645.52945: results queue empty 34139 1726867645.52946: checking for any_errors_fatal 34139 1726867645.52950: done checking for any_errors_fatal 34139 1726867645.52951: checking for max_fail_percentage 34139 1726867645.52953: done checking for max_fail_percentage 34139 1726867645.52953: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.52954: done checking to see if all hosts have failed 34139 1726867645.52955: getting the remaining hosts for this loop 34139 1726867645.52957: done getting the remaining hosts for this loop 34139 1726867645.52960: getting the next task for host managed_node1 34139 1726867645.52968: done getting next task for host managed_node1 34139 1726867645.52971: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867645.52976: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.52996: getting variables 34139 1726867645.52998: in VariableManager get_vars() 34139 1726867645.53046: Calling all_inventory to load vars for managed_node1 34139 1726867645.53049: Calling groups_inventory to load vars for managed_node1 34139 1726867645.53051: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.53062: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.53064: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.53067: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.53374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.53604: done with get_vars() 34139 1726867645.53615: done getting variables 34139 1726867645.53668: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:27:25 -0400 (0:00:00.023) 0:00:04.283 ****** 34139 1726867645.53700: entering _queue_task() for managed_node1/fail 34139 1726867645.53913: worker is 1 (out of 1 available) 34139 1726867645.53925: exiting _queue_task() for managed_node1/fail 34139 1726867645.53935: done queuing things up, now waiting for results queue to drain 34139 1726867645.53937: waiting for pending results... 
34139 1726867645.54190: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34139 1726867645.54318: in run() - task 0affcac9-a3a5-c103-b8fd-000000000104 34139 1726867645.54336: variable 'ansible_search_path' from source: unknown 34139 1726867645.54343: variable 'ansible_search_path' from source: unknown 34139 1726867645.54380: calling self._execute() 34139 1726867645.54500: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.54504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.54509: variable 'omit' from source: magic vars 34139 1726867645.54901: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.54920: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.55042: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.55053: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.55148: when evaluation is False, skipping this task 34139 1726867645.55151: _execute() done 34139 1726867645.55154: dumping result to json 34139 1726867645.55156: done dumping result, returning 34139 1726867645.55159: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000104] 34139 1726867645.55161: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000104 34139 1726867645.55231: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000104 34139 1726867645.55234: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.55282: no more pending results, returning what we have 
34139 1726867645.55285: results queue empty 34139 1726867645.55286: checking for any_errors_fatal 34139 1726867645.55291: done checking for any_errors_fatal 34139 1726867645.55292: checking for max_fail_percentage 34139 1726867645.55294: done checking for max_fail_percentage 34139 1726867645.55295: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.55296: done checking to see if all hosts have failed 34139 1726867645.55296: getting the remaining hosts for this loop 34139 1726867645.55298: done getting the remaining hosts for this loop 34139 1726867645.55301: getting the next task for host managed_node1 34139 1726867645.55310: done getting next task for host managed_node1 34139 1726867645.55313: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34139 1726867645.55317: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.55334: getting variables 34139 1726867645.55336: in VariableManager get_vars() 34139 1726867645.55381: Calling all_inventory to load vars for managed_node1 34139 1726867645.55384: Calling groups_inventory to load vars for managed_node1 34139 1726867645.55387: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.55398: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.55401: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.55404: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.55765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.55990: done with get_vars() 34139 1726867645.55999: done getting variables 34139 1726867645.56057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:27:25 -0400 (0:00:00.023) 0:00:04.306 ****** 34139 1726867645.56088: entering _queue_task() for managed_node1/package 34139 1726867645.56294: worker is 1 (out of 1 available) 34139 1726867645.56306: exiting _queue_task() for managed_node1/package 34139 1726867645.56320: done queuing things up, now waiting for results queue to drain 34139 1726867645.56322: waiting for pending results... 
34139 1726867645.56689: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34139 1726867645.56711: in run() - task 0affcac9-a3a5-c103-b8fd-000000000105 34139 1726867645.56728: variable 'ansible_search_path' from source: unknown 34139 1726867645.56736: variable 'ansible_search_path' from source: unknown 34139 1726867645.56769: calling self._execute() 34139 1726867645.56850: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.56862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.56876: variable 'omit' from source: magic vars 34139 1726867645.57225: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.57241: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.57360: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.57370: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.57382: when evaluation is False, skipping this task 34139 1726867645.57390: _execute() done 34139 1726867645.57396: dumping result to json 34139 1726867645.57403: done dumping result, returning 34139 1726867645.57415: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c103-b8fd-000000000105] 34139 1726867645.57425: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000105 34139 1726867645.57612: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000105 34139 1726867645.57616: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.57692: no more pending results, returning what we have 34139 1726867645.57696: results queue empty 34139 1726867645.57696: checking for any_errors_fatal 34139 1726867645.57701: done 
checking for any_errors_fatal 34139 1726867645.57701: checking for max_fail_percentage 34139 1726867645.57703: done checking for max_fail_percentage 34139 1726867645.57704: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.57704: done checking to see if all hosts have failed 34139 1726867645.57705: getting the remaining hosts for this loop 34139 1726867645.57706: done getting the remaining hosts for this loop 34139 1726867645.57712: getting the next task for host managed_node1 34139 1726867645.57718: done getting next task for host managed_node1 34139 1726867645.57722: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867645.57725: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.57740: getting variables 34139 1726867645.57742: in VariableManager get_vars() 34139 1726867645.57784: Calling all_inventory to load vars for managed_node1 34139 1726867645.57788: Calling groups_inventory to load vars for managed_node1 34139 1726867645.57790: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.57799: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.57802: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.57804: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.58054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.58311: done with get_vars() 34139 1726867645.58320: done getting variables 34139 1726867645.58369: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:27:25 -0400 (0:00:00.023) 0:00:04.330 ****** 34139 1726867645.58399: entering _queue_task() for managed_node1/package 34139 1726867645.58843: worker is 1 (out of 1 available) 34139 1726867645.58857: exiting _queue_task() for managed_node1/package 34139 1726867645.58868: done queuing things up, now waiting for results queue to drain 34139 1726867645.58870: waiting for pending results... 
34139 1726867645.59749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34139 1726867645.60175: in run() - task 0affcac9-a3a5-c103-b8fd-000000000106 34139 1726867645.60180: variable 'ansible_search_path' from source: unknown 34139 1726867645.60183: variable 'ansible_search_path' from source: unknown 34139 1726867645.60590: calling self._execute() 34139 1726867645.60594: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.60596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.60598: variable 'omit' from source: magic vars 34139 1726867645.61883: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.61887: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.61889: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.61891: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.61894: when evaluation is False, skipping this task 34139 1726867645.61896: _execute() done 34139 1726867645.61898: dumping result to json 34139 1726867645.61900: done dumping result, returning 34139 1726867645.62384: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-000000000106] 34139 1726867645.62389: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000106 34139 1726867645.62463: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000106 34139 1726867645.62467: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.62515: no more pending results, returning what we have 34139 1726867645.62518: 
results queue empty 34139 1726867645.62519: checking for any_errors_fatal 34139 1726867645.62527: done checking for any_errors_fatal 34139 1726867645.62528: checking for max_fail_percentage 34139 1726867645.62529: done checking for max_fail_percentage 34139 1726867645.62530: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.62531: done checking to see if all hosts have failed 34139 1726867645.62531: getting the remaining hosts for this loop 34139 1726867645.62533: done getting the remaining hosts for this loop 34139 1726867645.62536: getting the next task for host managed_node1 34139 1726867645.62545: done getting next task for host managed_node1 34139 1726867645.62548: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867645.62552: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.62566: getting variables 34139 1726867645.62568: in VariableManager get_vars() 34139 1726867645.62606: Calling all_inventory to load vars for managed_node1 34139 1726867645.62611: Calling groups_inventory to load vars for managed_node1 34139 1726867645.62614: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.62622: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.62624: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.62627: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.62916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.63414: done with get_vars() 34139 1726867645.63424: done getting variables 34139 1726867645.63521: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:27:25 -0400 (0:00:00.051) 0:00:04.381 ****** 34139 1726867645.63622: entering _queue_task() for managed_node1/package 34139 1726867645.64129: worker is 1 (out of 1 available) 34139 1726867645.64140: exiting _queue_task() for managed_node1/package 34139 1726867645.64150: done queuing things up, now waiting for results queue to drain 34139 1726867645.64152: waiting for pending results... 
34139 1726867645.64342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34139 1726867645.64470: in run() - task 0affcac9-a3a5-c103-b8fd-000000000107 34139 1726867645.64497: variable 'ansible_search_path' from source: unknown 34139 1726867645.64504: variable 'ansible_search_path' from source: unknown 34139 1726867645.64546: calling self._execute() 34139 1726867645.64631: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.64641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.64655: variable 'omit' from source: magic vars 34139 1726867645.65020: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.65037: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.65140: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.65154: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.65161: when evaluation is False, skipping this task 34139 1726867645.65167: _execute() done 34139 1726867645.65172: dumping result to json 34139 1726867645.65181: done dumping result, returning 34139 1726867645.65191: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c103-b8fd-000000000107] 34139 1726867645.65201: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000107 34139 1726867645.65403: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000107 34139 1726867645.65409: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.65458: no more pending results, returning what we have 34139 1726867645.65462: results queue 
empty 34139 1726867645.65462: checking for any_errors_fatal 34139 1726867645.65469: done checking for any_errors_fatal 34139 1726867645.65469: checking for max_fail_percentage 34139 1726867645.65471: done checking for max_fail_percentage 34139 1726867645.65472: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.65473: done checking to see if all hosts have failed 34139 1726867645.65473: getting the remaining hosts for this loop 34139 1726867645.65475: done getting the remaining hosts for this loop 34139 1726867645.65481: getting the next task for host managed_node1 34139 1726867645.65489: done getting next task for host managed_node1 34139 1726867645.65492: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867645.65496: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.65516: getting variables 34139 1726867645.65518: in VariableManager get_vars() 34139 1726867645.65568: Calling all_inventory to load vars for managed_node1 34139 1726867645.65571: Calling groups_inventory to load vars for managed_node1 34139 1726867645.65573: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.65685: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.65689: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.65692: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.65923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.66483: done with get_vars() 34139 1726867645.66491: done getting variables 34139 1726867645.66539: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:27:25 -0400 (0:00:00.030) 0:00:04.411 ****** 34139 1726867645.66566: entering _queue_task() for managed_node1/service 34139 1726867645.66986: worker is 1 (out of 1 available) 34139 1726867645.66999: exiting _queue_task() for managed_node1/service 34139 1726867645.67013: done queuing things up, now waiting for results queue to drain 34139 1726867645.67015: waiting for pending results... 
34139 1726867645.67795: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34139 1726867645.67800: in run() - task 0affcac9-a3a5-c103-b8fd-000000000108 34139 1726867645.67905: variable 'ansible_search_path' from source: unknown 34139 1726867645.67918: variable 'ansible_search_path' from source: unknown 34139 1726867645.67959: calling self._execute() 34139 1726867645.68182: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.68185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.68189: variable 'omit' from source: magic vars 34139 1726867645.68934: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.68951: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.69111: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.69383: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.69387: when evaluation is False, skipping this task 34139 1726867645.69390: _execute() done 34139 1726867645.69392: dumping result to json 34139 1726867645.69395: done dumping result, returning 34139 1726867645.69397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c103-b8fd-000000000108] 34139 1726867645.69400: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000108 34139 1726867645.69469: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000108 34139 1726867645.69472: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34139 1726867645.69524: no more pending results, returning what we have 34139 1726867645.69528: results queue empty 
34139 1726867645.69528: checking for any_errors_fatal 34139 1726867645.69534: done checking for any_errors_fatal 34139 1726867645.69535: checking for max_fail_percentage 34139 1726867645.69537: done checking for max_fail_percentage 34139 1726867645.69538: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.69539: done checking to see if all hosts have failed 34139 1726867645.69540: getting the remaining hosts for this loop 34139 1726867645.69541: done getting the remaining hosts for this loop 34139 1726867645.69545: getting the next task for host managed_node1 34139 1726867645.69552: done getting next task for host managed_node1 34139 1726867645.69555: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867645.69560: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.69581: getting variables 34139 1726867645.69584: in VariableManager get_vars() 34139 1726867645.69635: Calling all_inventory to load vars for managed_node1 34139 1726867645.69638: Calling groups_inventory to load vars for managed_node1 34139 1726867645.69641: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.69652: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.69655: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.69658: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.70204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.70539: done with get_vars() 34139 1726867645.70548: done getting variables 34139 1726867645.70910: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:27:25 -0400 (0:00:00.043) 0:00:04.455 ****** 34139 1726867645.70940: entering _queue_task() for managed_node1/service 34139 1726867645.71362: worker is 1 (out of 1 available) 34139 1726867645.71374: exiting _queue_task() for managed_node1/service 34139 1726867645.71388: done queuing things up, now waiting for results queue to drain 34139 1726867645.71390: waiting for pending results... 
34139 1726867645.72193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34139 1726867645.72198: in run() - task 0affcac9-a3a5-c103-b8fd-000000000109 34139 1726867645.72201: variable 'ansible_search_path' from source: unknown 34139 1726867645.72203: variable 'ansible_search_path' from source: unknown 34139 1726867645.72206: calling self._execute() 34139 1726867645.72285: variable 'ansible_host' from source: host vars for 'managed_node1' 34139 1726867645.72584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34139 1726867645.72588: variable 'omit' from source: magic vars 34139 1726867645.73017: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.73382: Evaluated conditional (ansible_distribution_major_version != '6'): True 34139 1726867645.73440: variable 'ansible_distribution_major_version' from source: facts 34139 1726867645.73452: Evaluated conditional (ansible_distribution_major_version == '7'): False 34139 1726867645.73459: when evaluation is False, skipping this task 34139 1726867645.73468: _execute() done 34139 1726867645.73475: dumping result to json 34139 1726867645.73486: done dumping result, returning 34139 1726867645.73499: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c103-b8fd-000000000109] 34139 1726867645.73508: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000109 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34139 1726867645.73654: no more pending results, returning what we have 34139 1726867645.73657: results queue empty 34139 1726867645.73658: checking for any_errors_fatal 34139 1726867645.73664: done checking for any_errors_fatal 34139 1726867645.73664: checking for max_fail_percentage 34139 1726867645.73666: done 
checking for max_fail_percentage 34139 1726867645.73667: checking to see if all hosts have failed and the running result is not ok 34139 1726867645.73668: done checking to see if all hosts have failed 34139 1726867645.73668: getting the remaining hosts for this loop 34139 1726867645.73669: done getting the remaining hosts for this loop 34139 1726867645.73673: getting the next task for host managed_node1 34139 1726867645.73682: done getting next task for host managed_node1 34139 1726867645.73685: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34139 1726867645.73693: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34139 1726867645.73710: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000109 34139 1726867645.73714: WORKER PROCESS EXITING 34139 1726867645.73811: getting variables 34139 1726867645.73813: in VariableManager get_vars() 34139 1726867645.73860: Calling all_inventory to load vars for managed_node1 34139 1726867645.73863: Calling groups_inventory to load vars for managed_node1 34139 1726867645.73865: Calling all_plugins_inventory to load vars for managed_node1 34139 1726867645.73876: Calling all_plugins_play to load vars for managed_node1 34139 1726867645.73881: Calling groups_plugins_inventory to load vars for managed_node1 34139 1726867645.73883: Calling groups_plugins_play to load vars for managed_node1 34139 1726867645.74352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34139 1726867645.74950: done with get_vars() 34139 1726867645.74961: done getting variables 34139 1726867645.75016: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:27:25 -0400 (0:00:00.041) 0:00:04.496 ****** 34139 1726867645.75044: entering _queue_task() for managed_node1/service 34139 1726867645.75646: worker is 1 (out of 1 available) 34139 1726867645.75660: exiting _queue_task() for managed_node1/service 34139 1726867645.75674: done queuing things up, now waiting for results queue to drain 34139 1726867645.75675: waiting for pending results... 
34139 1726867645.76222: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34139 1726867645.76452: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010a
34139 1726867645.76534: variable 'ansible_search_path' from source: unknown
34139 1726867645.76583: variable 'ansible_search_path' from source: unknown
34139 1726867645.76652: calling self._execute()
34139 1726867645.76885: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.76889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.76892: variable 'omit' from source: magic vars
34139 1726867645.77982: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.77986: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.78682: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.78686: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.78688: when evaluation is False, skipping this task
34139 1726867645.78690: _execute() done
34139 1726867645.78693: dumping result to json
34139 1726867645.78695: done dumping result, returning
34139 1726867645.78698: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c103-b8fd-00000000010a]
34139 1726867645.78700: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010a
34139 1726867645.78771: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010a
34139 1726867645.78775: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.78820: no more pending results, returning what we have
34139 1726867645.78823: results queue empty
34139 1726867645.78824: checking for any_errors_fatal
34139 1726867645.78828: done checking for any_errors_fatal
34139 1726867645.78828: checking for max_fail_percentage
34139 1726867645.78830: done checking for max_fail_percentage
34139 1726867645.78830: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.78831: done checking to see if all hosts have failed
34139 1726867645.78832: getting the remaining hosts for this loop
34139 1726867645.78833: done getting the remaining hosts for this loop
34139 1726867645.78836: getting the next task for host managed_node1
34139 1726867645.78841: done getting next task for host managed_node1
34139 1726867645.78844: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34139 1726867645.78847: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867645.78863: getting variables
34139 1726867645.78864: in VariableManager get_vars()
34139 1726867645.79006: Calling all_inventory to load vars for managed_node1
34139 1726867645.79011: Calling groups_inventory to load vars for managed_node1
34139 1726867645.79014: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.79022: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.79024: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.79027: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.79421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.79853: done with get_vars()
34139 1726867645.79863: done getting variables
34139 1726867645.80124: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 17:27:25 -0400 (0:00:00.051) 0:00:04.547 ******
34139 1726867645.80154: entering _queue_task() for managed_node1/service
34139 1726867645.80588: worker is 1 (out of 1 available)
34139 1726867645.80600: exiting _queue_task() for managed_node1/service
34139 1726867645.80613: done queuing things up, now waiting for results queue to drain
34139 1726867645.80615: waiting for pending results...
34139 1726867645.81139: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
34139 1726867645.81273: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010b
34139 1726867645.81408: variable 'ansible_search_path' from source: unknown
34139 1726867645.81417: variable 'ansible_search_path' from source: unknown
34139 1726867645.81469: calling self._execute()
34139 1726867645.81662: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.81692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.81896: variable 'omit' from source: magic vars
34139 1726867645.83287: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.83304: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.83686: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.83689: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.83692: when evaluation is False, skipping this task
34139 1726867645.83695: _execute() done
34139 1726867645.83697: dumping result to json
34139 1726867645.83699: done dumping result, returning
34139 1726867645.83701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c103-b8fd-00000000010b]
34139 1726867645.83703: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010b
34139 1726867645.83770: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010b
34139 1726867645.83773: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34139 1726867645.83828: no more pending results, returning what we have
34139 1726867645.83832: results queue empty
34139 1726867645.83832: checking for any_errors_fatal
34139 1726867645.83838: done checking for any_errors_fatal
34139 1726867645.83838: checking for max_fail_percentage
34139 1726867645.83840: done checking for max_fail_percentage
34139 1726867645.83841: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.83841: done checking to see if all hosts have failed
34139 1726867645.83842: getting the remaining hosts for this loop
34139 1726867645.83843: done getting the remaining hosts for this loop
34139 1726867645.83847: getting the next task for host managed_node1
34139 1726867645.83853: done getting next task for host managed_node1
34139 1726867645.83856: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34139 1726867645.83860: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867645.83875: getting variables
34139 1726867645.83878: in VariableManager get_vars()
34139 1726867645.83920: Calling all_inventory to load vars for managed_node1
34139 1726867645.83922: Calling groups_inventory to load vars for managed_node1
34139 1726867645.83924: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.83932: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.83934: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.83937: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.84913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.85326: done with get_vars()
34139 1726867645.85336: done getting variables
34139 1726867645.85396: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 17:27:25 -0400 (0:00:00.052) 0:00:04.600 ******
34139 1726867645.85431: entering _queue_task() for managed_node1/copy
34139 1726867645.86212: worker is 1 (out of 1 available)
34139 1726867645.86222: exiting _queue_task() for managed_node1/copy
34139 1726867645.86231: done queuing things up, now waiting for results queue to drain
34139 1726867645.86233: waiting for pending results...
34139 1726867645.86524: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34139 1726867645.86867: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010c
34139 1726867645.86881: variable 'ansible_search_path' from source: unknown
34139 1726867645.86888: variable 'ansible_search_path' from source: unknown
34139 1726867645.86926: calling self._execute()
34139 1726867645.87004: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.87282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.87286: variable 'omit' from source: magic vars
34139 1726867645.87770: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.88085: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.88113: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.88124: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.88131: when evaluation is False, skipping this task
34139 1726867645.88138: _execute() done
34139 1726867645.88144: dumping result to json
34139 1726867645.88151: done dumping result, returning
34139 1726867645.88161: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c103-b8fd-00000000010c]
34139 1726867645.88482: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010c
34139 1726867645.88556: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010c
34139 1726867645.88561: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.88606: no more pending results, returning what we have
34139 1726867645.88609: results queue empty
34139 1726867645.88610: checking for any_errors_fatal
34139 1726867645.88617: done checking for any_errors_fatal
34139 1726867645.88618: checking for max_fail_percentage
34139 1726867645.88620: done checking for max_fail_percentage
34139 1726867645.88620: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.88621: done checking to see if all hosts have failed
34139 1726867645.88622: getting the remaining hosts for this loop
34139 1726867645.88623: done getting the remaining hosts for this loop
34139 1726867645.88626: getting the next task for host managed_node1
34139 1726867645.88633: done getting next task for host managed_node1
34139 1726867645.88636: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34139 1726867645.88640: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867645.88654: getting variables
34139 1726867645.88656: in VariableManager get_vars()
34139 1726867645.88692: Calling all_inventory to load vars for managed_node1
34139 1726867645.88694: Calling groups_inventory to load vars for managed_node1
34139 1726867645.88696: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.88704: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.88706: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.88709: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.89085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.89498: done with get_vars()
34139 1726867645.89508: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 17:27:25 -0400 (0:00:00.043) 0:00:04.644 ******
34139 1726867645.89806: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34139 1726867645.90286: worker is 1 (out of 1 available)
34139 1726867645.90300: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34139 1726867645.90312: done queuing things up, now waiting for results queue to drain
34139 1726867645.90314: waiting for pending results...
34139 1726867645.91052: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34139 1726867645.91405: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010d
34139 1726867645.91430: variable 'ansible_search_path' from source: unknown
34139 1726867645.91438: variable 'ansible_search_path' from source: unknown
34139 1726867645.91882: calling self._execute()
34139 1726867645.91886: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.91889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.91892: variable 'omit' from source: magic vars
34139 1726867645.92418: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.92596: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.93083: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.93087: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.93090: when evaluation is False, skipping this task
34139 1726867645.93093: _execute() done
34139 1726867645.93095: dumping result to json
34139 1726867645.93097: done dumping result, returning
34139 1726867645.93100: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c103-b8fd-00000000010d]
34139 1726867645.93102: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010d
34139 1726867645.93183: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010d
34139 1726867645.93186: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.93237: no more pending results, returning what we have
34139 1726867645.93240: results queue empty
34139 1726867645.93240: checking for any_errors_fatal
34139 1726867645.93247: done checking for any_errors_fatal
34139 1726867645.93247: checking for max_fail_percentage
34139 1726867645.93249: done checking for max_fail_percentage
34139 1726867645.93250: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.93251: done checking to see if all hosts have failed
34139 1726867645.93251: getting the remaining hosts for this loop
34139 1726867645.93253: done getting the remaining hosts for this loop
34139 1726867645.93257: getting the next task for host managed_node1
34139 1726867645.93265: done getting next task for host managed_node1
34139 1726867645.93268: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
34139 1726867645.93273: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867645.93294: getting variables
34139 1726867645.93296: in VariableManager get_vars()
34139 1726867645.93342: Calling all_inventory to load vars for managed_node1
34139 1726867645.93345: Calling groups_inventory to load vars for managed_node1
34139 1726867645.93347: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.93358: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.93361: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.93363: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.93900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.94339: done with get_vars()
34139 1726867645.94350: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 17:27:25 -0400 (0:00:00.046) 0:00:04.690 ******
34139 1726867645.94432: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34139 1726867645.95084: worker is 1 (out of 1 available)
34139 1726867645.95098: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34139 1726867645.95111: done queuing things up, now waiting for results queue to drain
34139 1726867645.95113: waiting for pending results...
34139 1726867645.95701: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
34139 1726867645.95883: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010e
34139 1726867645.95887: variable 'ansible_search_path' from source: unknown
34139 1726867645.95889: variable 'ansible_search_path' from source: unknown
34139 1726867645.95892: calling self._execute()
34139 1726867645.96286: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.96289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.96292: variable 'omit' from source: magic vars
34139 1726867645.97083: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.97087: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867645.97090: variable 'ansible_distribution_major_version' from source: facts
34139 1726867645.97092: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867645.97095: when evaluation is False, skipping this task
34139 1726867645.97097: _execute() done
34139 1726867645.97100: dumping result to json
34139 1726867645.97102: done dumping result, returning
34139 1726867645.97105: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c103-b8fd-00000000010e]
34139 1726867645.97107: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010e
34139 1726867645.97180: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010e
34139 1726867645.97184: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867645.97239: no more pending results, returning what we have
34139 1726867645.97242: results queue empty
34139 1726867645.97243: checking for any_errors_fatal
34139 1726867645.97249: done checking for any_errors_fatal
34139 1726867645.97250: checking for max_fail_percentage
34139 1726867645.97251: done checking for max_fail_percentage
34139 1726867645.97253: checking to see if all hosts have failed and the running result is not ok
34139 1726867645.97254: done checking to see if all hosts have failed
34139 1726867645.97255: getting the remaining hosts for this loop
34139 1726867645.97256: done getting the remaining hosts for this loop
34139 1726867645.97259: getting the next task for host managed_node1
34139 1726867645.97267: done getting next task for host managed_node1
34139 1726867645.97270: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34139 1726867645.97276: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867645.97296: getting variables
34139 1726867645.97298: in VariableManager get_vars()
34139 1726867645.97343: Calling all_inventory to load vars for managed_node1
34139 1726867645.97346: Calling groups_inventory to load vars for managed_node1
34139 1726867645.97348: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867645.97358: Calling all_plugins_play to load vars for managed_node1
34139 1726867645.97360: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867645.97363: Calling groups_plugins_play to load vars for managed_node1
34139 1726867645.97904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867645.98282: done with get_vars()
34139 1726867645.98292: done getting variables
34139 1726867645.98468: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 17:27:25 -0400 (0:00:00.040) 0:00:04.731 ******
34139 1726867645.98501: entering _queue_task() for managed_node1/debug
34139 1726867645.98961: worker is 1 (out of 1 available)
34139 1726867645.98973: exiting _queue_task() for managed_node1/debug
34139 1726867645.99092: done queuing things up, now waiting for results queue to drain
34139 1726867645.99094: waiting for pending results...
34139 1726867645.99490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34139 1726867645.99984: in run() - task 0affcac9-a3a5-c103-b8fd-00000000010f
34139 1726867645.99987: variable 'ansible_search_path' from source: unknown
34139 1726867645.99990: variable 'ansible_search_path' from source: unknown
34139 1726867645.99993: calling self._execute()
34139 1726867645.99995: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867645.99997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867645.99999: variable 'omit' from source: magic vars
34139 1726867646.00711: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.00728: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.01087: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.01282: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.01286: when evaluation is False, skipping this task
34139 1726867646.01289: _execute() done
34139 1726867646.01291: dumping result to json
34139 1726867646.01294: done dumping result, returning
34139 1726867646.01296: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c103-b8fd-00000000010f]
34139 1726867646.01298: sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010f
34139 1726867646.01350: done sending task result for task 0affcac9-a3a5-c103-b8fd-00000000010f
34139 1726867646.01353: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867646.01502: no more pending results, returning what we have
34139 1726867646.01505: results queue empty
34139 1726867646.01506: checking for any_errors_fatal
34139 1726867646.01513: done checking for any_errors_fatal
34139 1726867646.01514: checking for max_fail_percentage
34139 1726867646.01516: done checking for max_fail_percentage
34139 1726867646.01517: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.01518: done checking to see if all hosts have failed
34139 1726867646.01519: getting the remaining hosts for this loop
34139 1726867646.01520: done getting the remaining hosts for this loop
34139 1726867646.01523: getting the next task for host managed_node1
34139 1726867646.01530: done getting next task for host managed_node1
34139 1726867646.01533: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34139 1726867646.01538: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.01554: getting variables
34139 1726867646.01556: in VariableManager get_vars()
34139 1726867646.01693: Calling all_inventory to load vars for managed_node1
34139 1726867646.01698: Calling groups_inventory to load vars for managed_node1
34139 1726867646.01700: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.01711: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.01714: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.01717: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.02117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.02532: done with get_vars()
34139 1726867646.02656: done getting variables
34139 1726867646.02718: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 17:27:26 -0400 (0:00:00.042) 0:00:04.773 ******
34139 1726867646.02750: entering _queue_task() for managed_node1/debug
34139 1726867646.03132: worker is 1 (out of 1 available)
34139 1726867646.03144: exiting _queue_task() for managed_node1/debug
34139 1726867646.03154: done queuing things up, now waiting for results queue to drain
34139 1726867646.03156: waiting for pending results...
34139 1726867646.03750: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34139 1726867646.03895: in run() - task 0affcac9-a3a5-c103-b8fd-000000000110
34139 1726867646.03923: variable 'ansible_search_path' from source: unknown
34139 1726867646.03970: variable 'ansible_search_path' from source: unknown
34139 1726867646.04081: calling self._execute()
34139 1726867646.04382: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867646.04386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867646.04389: variable 'omit' from source: magic vars
34139 1726867646.05114: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.05139: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.05439: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.05783: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.05786: when evaluation is False, skipping this task
34139 1726867646.05788: _execute() done
34139 1726867646.05791: dumping result to json
34139 1726867646.05794: done dumping result, returning
34139 1726867646.05799: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c103-b8fd-000000000110]
34139 1726867646.05802: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000110
34139 1726867646.05861: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000110
34139 1726867646.05864: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867646.05906: no more pending results, returning what we have
34139 1726867646.05911: results queue empty
34139 1726867646.05912: checking for any_errors_fatal
34139 1726867646.05916: done checking for any_errors_fatal
34139 1726867646.05917: checking for max_fail_percentage
34139 1726867646.05919: done checking for max_fail_percentage
34139 1726867646.05919: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.05920: done checking to see if all hosts have failed
34139 1726867646.05921: getting the remaining hosts for this loop
34139 1726867646.05922: done getting the remaining hosts for this loop
34139 1726867646.05925: getting the next task for host managed_node1
34139 1726867646.05930: done getting next task for host managed_node1
34139 1726867646.05934: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34139 1726867646.05938: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.05953: getting variables
34139 1726867646.05954: in VariableManager get_vars()
34139 1726867646.05997: Calling all_inventory to load vars for managed_node1
34139 1726867646.05999: Calling groups_inventory to load vars for managed_node1
34139 1726867646.06002: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.06013: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.06015: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.06019: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.06664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.07098: done with get_vars()
34139 1726867646.07111: done getting variables
34139 1726867646.07166: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 17:27:26 -0400 (0:00:00.046) 0:00:04.820 ******
34139 1726867646.07403: entering _queue_task() for managed_node1/debug
34139 1726867646.07844: worker is 1 (out of 1 available)
34139 1726867646.07856: exiting _queue_task() for managed_node1/debug
34139 1726867646.07867: done queuing things up, now waiting for results queue to drain
34139 1726867646.07868: waiting for pending results...
34139 1726867646.08395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34139 1726867646.08441: in run() - task 0affcac9-a3a5-c103-b8fd-000000000111
34139 1726867646.08454: variable 'ansible_search_path' from source: unknown
34139 1726867646.08458: variable 'ansible_search_path' from source: unknown
34139 1726867646.08794: calling self._execute()
34139 1726867646.08864: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867646.08868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867646.08965: variable 'omit' from source: magic vars
34139 1726867646.09618: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.09629: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.09737: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.09743: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.09746: when evaluation is False, skipping this task
34139 1726867646.09749: _execute() done
34139 1726867646.09751: dumping result to json
34139 1726867646.09756: done dumping result, returning
34139 1726867646.09764: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c103-b8fd-000000000111]
34139 1726867646.09769: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000111
34139 1726867646.10064: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000111
34139 1726867646.10067: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34139 1726867646.10136: no more pending results, returning what we have
34139 1726867646.10139: results queue empty
34139 1726867646.10140: checking for any_errors_fatal
34139 1726867646.10144: done checking for any_errors_fatal
34139 1726867646.10144: checking for max_fail_percentage
34139 1726867646.10146: done checking for max_fail_percentage
34139 1726867646.10146: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.10147: done checking to see if all hosts have failed
34139 1726867646.10148: getting the remaining hosts for this loop
34139 1726867646.10149: done getting the remaining hosts for this loop
34139 1726867646.10152: getting the next task for host managed_node1
34139 1726867646.10157: done getting next task for host managed_node1
34139 1726867646.10160: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
34139 1726867646.10164: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.10181: getting variables
34139 1726867646.10183: in VariableManager get_vars()
34139 1726867646.10221: Calling all_inventory to load vars for managed_node1
34139 1726867646.10224: Calling groups_inventory to load vars for managed_node1
34139 1726867646.10226: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.10234: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.10236: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.10239: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.10627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.11062: done with get_vars()
34139 1726867646.11072: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 17:27:26 -0400 (0:00:00.039) 0:00:04.859 ******
34139 1726867646.11364: entering _queue_task() for managed_node1/ping
34139 1726867646.11799: worker is 1 (out of 1 available)
34139 1726867646.11812: exiting _queue_task() for managed_node1/ping
34139 1726867646.11822: done queuing things up, now waiting for results queue to drain
34139 1726867646.11824: waiting for pending results...
34139 1726867646.12064: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
34139 1726867646.12136: in run() - task 0affcac9-a3a5-c103-b8fd-000000000112
34139 1726867646.12161: variable 'ansible_search_path' from source: unknown
34139 1726867646.12168: variable 'ansible_search_path' from source: unknown
34139 1726867646.12211: calling self._execute()
34139 1726867646.12310: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867646.12323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867646.12384: variable 'omit' from source: magic vars
34139 1726867646.12815: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.12836: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.12961: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.12971: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.12980: when evaluation is False, skipping this task
34139 1726867646.12988: _execute() done
34139 1726867646.13026: dumping result to json
34139 1726867646.13030: done dumping result, returning
34139 1726867646.13033: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c103-b8fd-000000000112]
34139 1726867646.13035: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000112
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867646.13176: no more pending results, returning what we have
34139 1726867646.13182: results queue empty
34139 1726867646.13182: checking for any_errors_fatal
34139 1726867646.13187: done checking for any_errors_fatal
34139 1726867646.13187: checking for max_fail_percentage
34139 1726867646.13189: done checking for max_fail_percentage
34139 1726867646.13190: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.13191: done checking to see if all hosts have failed
34139 1726867646.13192: getting the remaining hosts for this loop
34139 1726867646.13193: done getting the remaining hosts for this loop
34139 1726867646.13196: getting the next task for host managed_node1
34139 1726867646.13205: done getting next task for host managed_node1
34139 1726867646.13209: ^ task is: TASK: meta (role_complete)
34139 1726867646.13213: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.13231: getting variables
34139 1726867646.13233: in VariableManager get_vars()
34139 1726867646.13276: Calling all_inventory to load vars for managed_node1
34139 1726867646.13280: Calling groups_inventory to load vars for managed_node1
34139 1726867646.13283: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.13294: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.13297: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.13300: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.13731: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000112
34139 1726867646.13735: WORKER PROCESS EXITING
34139 1726867646.13756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.13997: done with get_vars()
34139 1726867646.14006: done getting variables
34139 1726867646.14089: done queuing things up, now waiting for results queue to drain
34139 1726867646.14091: results queue empty
34139 1726867646.14092: checking for any_errors_fatal
34139 1726867646.14094: done checking for any_errors_fatal
34139 1726867646.14095: checking for max_fail_percentage
34139 1726867646.14095: done checking for max_fail_percentage
34139 1726867646.14096: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.14097: done checking to see if all hosts have failed
34139 1726867646.14097: getting the remaining hosts for this loop
34139 1726867646.14098: done getting the remaining hosts for this loop
34139 1726867646.14100: getting the next task for host managed_node1
34139 1726867646.14104: done getting next task for host managed_node1
34139 1726867646.14106: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml'
34139 1726867646.14111: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.14113: getting variables
34139 1726867646.14114: in VariableManager get_vars()
34139 1726867646.14133: Calling all_inventory to load vars for managed_node1
34139 1726867646.14135: Calling groups_inventory to load vars for managed_node1
34139 1726867646.14137: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.14141: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.14144: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.14147: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.14296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.14501: done with get_vars()
34139 1726867646.14511: done getting variables

TASK [Include the task 'cleanup_mock_wifi.yml'] ********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96
Friday 20 September 2024 17:27:26 -0400 (0:00:00.032) 0:00:04.891 ******
34139 1726867646.14570: entering _queue_task() for managed_node1/include_tasks
34139 1726867646.14914: worker is 1 (out of 1 available)
34139 1726867646.14927: exiting _queue_task() for managed_node1/include_tasks
34139 1726867646.14937: done queuing things up, now waiting for results queue to drain
34139 1726867646.14938: waiting for pending results...
34139 1726867646.15116: running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml'
34139 1726867646.15229: in run() - task 0affcac9-a3a5-c103-b8fd-000000000142
34139 1726867646.15249: variable 'ansible_search_path' from source: unknown
34139 1726867646.15298: calling self._execute()
34139 1726867646.15399: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867646.15413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867646.15430: variable 'omit' from source: magic vars
34139 1726867646.15899: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.15922: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.16053: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.16065: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.16082: when evaluation is False, skipping this task
34139 1726867646.16085: _execute() done
34139 1726867646.16183: dumping result to json
34139 1726867646.16186: done dumping result, returning
34139 1726867646.16188: done running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' [0affcac9-a3a5-c103-b8fd-000000000142]
34139 1726867646.16190: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000142
34139 1726867646.16265: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000142
34139 1726867646.16268: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867646.16325: no more pending results, returning what we have
34139 1726867646.16328: results queue empty
34139 1726867646.16329: checking for any_errors_fatal
34139 1726867646.16330: done checking for any_errors_fatal
34139 1726867646.16331: checking for max_fail_percentage
34139 1726867646.16332: done checking for max_fail_percentage
34139 1726867646.16333: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.16334: done checking to see if all hosts have failed
34139 1726867646.16335: getting the remaining hosts for this loop
34139 1726867646.16336: done getting the remaining hosts for this loop
34139 1726867646.16340: getting the next task for host managed_node1
34139 1726867646.16347: done getting next task for host managed_node1
34139 1726867646.16349: ^ task is: TASK: Verify network state restored to default
34139 1726867646.16352: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34139 1726867646.16355: getting variables
34139 1726867646.16357: in VariableManager get_vars()
34139 1726867646.16400: Calling all_inventory to load vars for managed_node1
34139 1726867646.16403: Calling groups_inventory to load vars for managed_node1
34139 1726867646.16405: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.16529: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.16532: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.16536: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.16811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.17039: done with get_vars()
34139 1726867646.17049: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Friday 20 September 2024 17:27:26 -0400 (0:00:00.026) 0:00:04.918 ******
34139 1726867646.17250: entering _queue_task() for managed_node1/include_tasks
34139 1726867646.17712: worker is 1 (out of 1 available)
34139 1726867646.17792: exiting _queue_task() for managed_node1/include_tasks
34139 1726867646.17802: done queuing things up, now waiting for results queue to drain
34139 1726867646.17804: waiting for pending results...
34139 1726867646.18433: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default
34139 1726867646.18484: in run() - task 0affcac9-a3a5-c103-b8fd-000000000143
34139 1726867646.18493: variable 'ansible_search_path' from source: unknown
34139 1726867646.18552: calling self._execute()
34139 1726867646.18645: variable 'ansible_host' from source: host vars for 'managed_node1'
34139 1726867646.18683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34139 1726867646.18686: variable 'omit' from source: magic vars
34139 1726867646.19042: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.19060: Evaluated conditional (ansible_distribution_major_version != '6'): True
34139 1726867646.19181: variable 'ansible_distribution_major_version' from source: facts
34139 1726867646.19253: Evaluated conditional (ansible_distribution_major_version == '7'): False
34139 1726867646.19256: when evaluation is False, skipping this task
34139 1726867646.19259: _execute() done
34139 1726867646.19261: dumping result to json
34139 1726867646.19263: done dumping result, returning
34139 1726867646.19265: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affcac9-a3a5-c103-b8fd-000000000143]
34139 1726867646.19267: sending task result for task 0affcac9-a3a5-c103-b8fd-000000000143
34139 1726867646.19336: done sending task result for task 0affcac9-a3a5-c103-b8fd-000000000143
34139 1726867646.19340: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34139 1726867646.19392: no more pending results, returning what we have
34139 1726867646.19395: results queue empty
34139 1726867646.19396: checking for any_errors_fatal
34139 1726867646.19403: done checking for any_errors_fatal
34139 1726867646.19404: checking for max_fail_percentage
34139 1726867646.19405: done checking for max_fail_percentage
34139 1726867646.19406: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.19409: done checking to see if all hosts have failed
34139 1726867646.19410: getting the remaining hosts for this loop
34139 1726867646.19411: done getting the remaining hosts for this loop
34139 1726867646.19414: getting the next task for host managed_node1
34139 1726867646.19423: done getting next task for host managed_node1
34139 1726867646.19425: ^ task is: TASK: meta (flush_handlers)
34139 1726867646.19427: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867646.19431: getting variables
34139 1726867646.19433: in VariableManager get_vars()
34139 1726867646.19475: Calling all_inventory to load vars for managed_node1
34139 1726867646.19479: Calling groups_inventory to load vars for managed_node1
34139 1726867646.19482: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.19493: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.19496: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.19498: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.20373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.20660: done with get_vars()
34139 1726867646.20670: done getting variables
34139 1726867646.20735: in VariableManager get_vars()
34139 1726867646.20764: Calling all_inventory to load vars for managed_node1
34139 1726867646.20767: Calling groups_inventory to load vars for managed_node1
34139 1726867646.20768: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.20773: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.20775: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.20779: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.21030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.21288: done with get_vars()
34139 1726867646.21301: done queuing things up, now waiting for results queue to drain
34139 1726867646.21303: results queue empty
34139 1726867646.21304: checking for any_errors_fatal
34139 1726867646.21306: done checking for any_errors_fatal
34139 1726867646.21309: checking for max_fail_percentage
34139 1726867646.21310: done checking for max_fail_percentage
34139 1726867646.21311: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.21312: done checking to see if all hosts have failed
34139 1726867646.21313: getting the remaining hosts for this loop
34139 1726867646.21314: done getting the remaining hosts for this loop
34139 1726867646.21316: getting the next task for host managed_node1
34139 1726867646.21320: done getting next task for host managed_node1
34139 1726867646.21321: ^ task is: TASK: meta (flush_handlers)
34139 1726867646.21322: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867646.21324: getting variables
34139 1726867646.21325: in VariableManager get_vars()
34139 1726867646.21341: Calling all_inventory to load vars for managed_node1
34139 1726867646.21343: Calling groups_inventory to load vars for managed_node1
34139 1726867646.21345: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.21350: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.21352: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.21355: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.21522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.22000: done with get_vars()
34139 1726867646.22010: done getting variables
34139 1726867646.22054: in VariableManager get_vars()
34139 1726867646.22068: Calling all_inventory to load vars for managed_node1
34139 1726867646.22070: Calling groups_inventory to load vars for managed_node1
34139 1726867646.22071: Calling all_plugins_inventory to load vars for managed_node1
34139 1726867646.22075: Calling all_plugins_play to load vars for managed_node1
34139 1726867646.22079: Calling groups_plugins_inventory to load vars for managed_node1
34139 1726867646.22081: Calling groups_plugins_play to load vars for managed_node1
34139 1726867646.22517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34139 1726867646.22925: done with get_vars()
34139 1726867646.22936: done queuing things up, now waiting for results queue to drain
34139 1726867646.22938: results queue empty
34139 1726867646.22939: checking for any_errors_fatal
34139 1726867646.22940: done checking for any_errors_fatal
34139 1726867646.22940: checking for max_fail_percentage
34139 1726867646.22941: done checking for max_fail_percentage
34139 1726867646.22942: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.22942: done checking to see if all hosts have failed
34139 1726867646.22943: getting the remaining hosts for this loop
34139 1726867646.22944: done getting the remaining hosts for this loop
34139 1726867646.22951: getting the next task for host managed_node1
34139 1726867646.22954: done getting next task for host managed_node1
34139 1726867646.22955: ^ task is: None
34139 1726867646.22956: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34139 1726867646.22957: done queuing things up, now waiting for results queue to drain
34139 1726867646.22958: results queue empty
34139 1726867646.22959: checking for any_errors_fatal
34139 1726867646.22959: done checking for any_errors_fatal
34139 1726867646.22960: checking for max_fail_percentage
34139 1726867646.22961: done checking for max_fail_percentage
34139 1726867646.22961: checking to see if all hosts have failed and the running result is not ok
34139 1726867646.22962: done checking to see if all hosts have failed
34139 1726867646.22964: getting the next task for host managed_node1
34139 1726867646.22966: done getting next task for host managed_node1
34139 1726867646.22966: ^ task is: None
34139 1726867646.22967: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=7 changed=0 unreachable=0 failed=0 skipped=102 rescued=0 ignored=0

Friday 20 September 2024 17:27:26 -0400 (0:00:00.058) 0:00:04.977 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.26s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Gather the minimum subset of ansible_facts required by the network role test --- 0.63s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.49s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Include the task 'enable_epel.yml' -------------------------------------- 0.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Verify network state restored to default -------------------------------- 0.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Set network provider to 'nm' -------------------------------------------- 0.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13
fedora.linux_system_roles.network : Enable network service -------------- 0.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking state ---------- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Copy client certs ------------------------------------------------------- 0.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
34139 1726867646.23156: RUNNING CLEANUP