[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
34006 1726882659.60256: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
34006 1726882659.60543: Added group all to inventory
34006 1726882659.60545: Added group ungrouped to inventory
34006 1726882659.60547: Group all now contains ungrouped
34006 1726882659.60549: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
34006 1726882659.70765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
34006 1726882659.70856: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
34006 1726882659.70882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
34006 1726882659.70959: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
34006 1726882659.71055: Loaded config def from plugin (inventory/script)
34006 1726882659.71057: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
34006 1726882659.71104: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
34006 1726882659.71222: Loaded config def from plugin (inventory/yaml)
34006 1726882659.71224: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
34006 1726882659.71323: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
34006 1726882659.71872: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
34006 1726882659.71875: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
34006 1726882659.71878: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
34006 1726882659.71884: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
34006 1726882659.71891: Loading data from /tmp/network-Kc3/inventory.yml
34006 1726882659.72013: /tmp/network-Kc3/inventory.yml was not parsable by auto
34006 1726882659.72084: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
34006 1726882659.72153: Loading data from /tmp/network-Kc3/inventory.yml
34006 1726882659.72256: group all already in inventory
34006 1726882659.72263: set inventory_file for managed_node1
34006 1726882659.72268: set inventory_dir for managed_node1
34006 1726882659.72269: Added host managed_node1 to inventory
34006 1726882659.72271: Added host managed_node1 to group all
34006 1726882659.72272: set ansible_host for managed_node1
34006 1726882659.72273: set ansible_ssh_extra_args for managed_node1
34006 1726882659.72276: set inventory_file for managed_node2
34006 1726882659.72279: set inventory_dir for managed_node2
34006 1726882659.72281: Added host managed_node2 to inventory
34006 1726882659.72282: Added host managed_node2 to group all
34006 1726882659.72283: set ansible_host for managed_node2
34006 1726882659.72284: set ansible_ssh_extra_args for managed_node2
34006 1726882659.72289: set inventory_file for managed_node3
34006 1726882659.72292: set inventory_dir for managed_node3
34006 1726882659.72292: Added host managed_node3 to inventory
34006 1726882659.72295: Added host managed_node3 to group all
34006 1726882659.72296: set ansible_host for managed_node3
34006 1726882659.72297: set ansible_ssh_extra_args for managed_node3
34006 1726882659.72299: Reconcile groups and hosts in inventory.
34006 1726882659.72303: Group ungrouped now contains managed_node1
34006 1726882659.72305: Group ungrouped now contains managed_node2
34006 1726882659.72310: Group ungrouped now contains managed_node3
34006 1726882659.72410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
34006 1726882659.72561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
34006 1726882659.72627: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
34006 1726882659.72656: Loaded config def from plugin (vars/host_group_vars)
34006 1726882659.72659: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
34006 1726882659.72666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
34006 1726882659.72675: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
34006 1726882659.72723: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
34006 1726882659.73152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882659.73265: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
34006 1726882659.73308: Loaded config def from plugin (connection/local)
34006 1726882659.73311: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
34006 1726882659.74021: Loaded config def from plugin (connection/paramiko_ssh)
34006 1726882659.74025: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
34006 1726882659.75015: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34006 1726882659.75054: Loaded config def from plugin (connection/psrp)
34006 1726882659.75056: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
34006 1726882659.75858: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34006 1726882659.75900: Loaded config def from plugin (connection/ssh)
34006 1726882659.75902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
34006 1726882659.79152: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34006 1726882659.79260: Loaded config def from plugin (connection/winrm)
34006 1726882659.79264: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
34006 1726882659.79298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
34006 1726882659.79446: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
34006 1726882659.79658: Loaded config def from plugin (shell/cmd)
34006 1726882659.79661: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
34006 1726882659.79688: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
34006 1726882659.79829: Loaded config def from plugin (shell/powershell)
34006 1726882659.79832: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
34006 1726882659.79895: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
34006 1726882659.80171: Loaded config def from plugin (shell/sh)
34006 1726882659.80177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
34006 1726882659.80247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
34006 1726882659.80372: Loaded config def from plugin (become/runas)
34006 1726882659.80375: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
34006 1726882659.80526: Loaded config def from plugin (become/su)
34006 1726882659.80528: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
34006 1726882659.80675: Loaded config def from plugin (become/sudo)
34006 1726882659.80678: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
34006 1726882659.80712: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34006 1726882659.81038: in VariableManager get_vars()
34006 1726882659.81059: done with get_vars()
34006 1726882659.81175: trying /usr/local/lib/python3.12/site-packages/ansible/modules
34006 1726882659.84959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
34006 1726882659.85075: in VariableManager get_vars()
34006 1726882659.85080: done with get_vars()
34006 1726882659.85083: variable 'playbook_dir' from source: magic vars
34006 1726882659.85084: variable 'ansible_playbook_python' from source: magic vars
34006 1726882659.85084: variable 'ansible_config_file' from source: magic vars
34006 1726882659.85085: variable 'groups' from source: magic vars
34006 1726882659.85086: variable 'omit' from source: magic vars
34006 1726882659.85087: variable 'ansible_version' from source: magic vars
34006 1726882659.85087: variable 'ansible_check_mode' from source: magic vars
34006 1726882659.85088: variable 'ansible_diff_mode' from source: magic vars
34006 1726882659.85089: variable 'ansible_forks' from source: magic vars
34006 1726882659.85089: variable 'ansible_inventory_sources' from source: magic vars
34006 1726882659.85090: variable 'ansible_skip_tags' from source: magic vars
34006 1726882659.85091: variable 'ansible_limit' from source: magic vars
34006 1726882659.85091: variable 'ansible_run_tags' from source: magic vars
34006 1726882659.85092: variable 'ansible_verbosity' from source: magic vars
34006 1726882659.85330: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
34006 1726882659.86508: in VariableManager get_vars()
34006 1726882659.86526: done with get_vars()
34006 1726882659.86648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
34006 1726882659.87036: in VariableManager get_vars()
34006 1726882659.87049: done with get_vars()
34006 1726882659.87054: variable 'omit' from source: magic vars
34006 1726882659.87072: variable 'omit' from source: magic vars
34006 1726882659.87307: in VariableManager get_vars()
34006 1726882659.87319: done with get_vars()
34006 1726882659.87363: in VariableManager get_vars()
34006 1726882659.87375: done with get_vars()
34006 1726882659.87411: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34006 1726882659.87826: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34006 1726882659.88156: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34006 1726882659.89346: in VariableManager get_vars()
34006 1726882659.89363: done with get_vars()
34006 1726882659.90587: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
34006 1726882659.91122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882659.94763: in VariableManager get_vars()
34006 1726882659.94783: done with get_vars()
34006 1726882659.94788: variable 'omit' from source: magic vars
34006 1726882659.95002: variable 'omit' from source: magic vars
34006 1726882659.95035: in VariableManager get_vars()
34006 1726882659.95064: done with get_vars()
34006 1726882659.95085: in VariableManager get_vars()
34006 1726882659.95100: done with get_vars()
34006 1726882659.95128: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34006 1726882659.95444: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34006 1726882659.95521: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34006 1726882660.00524: in VariableManager get_vars()
34006 1726882660.00549: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882660.04330: in VariableManager get_vars()
34006 1726882660.04351: done with get_vars()
34006 1726882660.04356: variable 'omit' from source: magic vars
34006 1726882660.04368: variable 'omit' from source: magic vars
34006 1726882660.04604: in VariableManager get_vars()
34006 1726882660.04622: done with get_vars()
34006 1726882660.04642: in VariableManager get_vars()
34006 1726882660.04658: done with get_vars()
34006 1726882660.04686: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34006 1726882660.05024: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34006 1726882660.05101: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34006 1726882660.05891: in VariableManager get_vars()
34006 1726882660.05916: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882660.09767: in VariableManager get_vars()
34006 1726882660.09795: done with get_vars()
34006 1726882660.09800: variable 'omit' from source: magic vars
34006 1726882660.09825: variable 'omit' from source: magic vars
34006 1726882660.09862: in VariableManager get_vars()
34006 1726882660.09882: done with get_vars()
34006 1726882660.09903: in VariableManager get_vars()
34006 1726882660.09925: done with get_vars()
34006 1726882660.09955: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34006 1726882660.10385: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34006 1726882660.10454: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34006 1726882660.11337: in VariableManager get_vars()
34006 1726882660.11363: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882660.17430: in VariableManager get_vars()
34006 1726882660.17460: done with get_vars()
34006 1726882660.17902: in VariableManager get_vars()
34006 1726882660.17928: done with get_vars()
34006 1726882660.17990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
34006 1726882660.18012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
34006 1726882660.18485: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
34006 1726882660.18941: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
34006 1726882660.18944: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
34006 1726882660.18974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
34006 1726882660.19103: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
34006 1726882660.19266: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
34006 1726882660.19529: Loaded config def from plugin (callback/default)
34006 1726882660.19532: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34006 1726882660.21921: Loaded config def from plugin (callback/junit)
34006 1726882660.21924: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34006 1726882660.21966: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
34006 1726882660.22031: Loaded config def from plugin (callback/minimal)
34006 1726882660.22034: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34006 1726882660.22071: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34006 1726882660.22331: Loaded config def from plugin (callback/tree)
34006 1726882660.22333: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
34006 1726882660.22444: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
34006 1726882660.22447: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
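Editor's note, not part of the captured run: the two warnings at the top of this log (the `ANSIBLE_COLLECTIONS_PATHS` deprecation and the repeated deprecation notice) can be addressed before a re-run. A hedged sketch, assuming environment-variable configuration; per the warning text, `ANSIBLE_COLLECTIONS_PATH` (singular) is the replacement and `deprecation_warnings=False` in `ansible.cfg` silences the notices:

```shell
# Use the singular variable name; the plural form is slated for removal
# in ansible-core 2.19 according to the warning above.
export ANSIBLE_COLLECTIONS_PATH=/tmp/collections-spT
# Environment-variable equivalent of deprecation_warnings=False in ansible.cfg.
export ANSIBLE_DEPRECATION_WARNINGS=False
# Then re-run as in this log (commented out here; illustrative only):
# ansible-playbook -vvvv tests_wireless_nm.yml
```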
PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34006 1726882660.22472: in VariableManager get_vars()
34006 1726882660.22484: done with get_vars()
34006 1726882660.22489: in VariableManager get_vars()
34006 1726882660.22703: done with get_vars()
34006 1726882660.22708: variable 'omit' from source: magic vars
34006 1726882660.22743: in VariableManager get_vars()
34006 1726882660.22757: done with get_vars()
34006 1726882660.22778: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
34006 1726882660.24067: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
34006 1726882660.24139: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
34006 1726882660.24600: getting the remaining hosts for this loop
34006 1726882660.24602: done getting the remaining hosts for this loop
34006 1726882660.24605: getting the next task for host managed_node3
34006 1726882660.24609: done getting next task for host managed_node3
34006 1726882660.24611: ^ task is: TASK: Gathering Facts
34006 1726882660.24612: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882660.24615: getting variables
34006 1726882660.24616: in VariableManager get_vars()
34006 1726882660.24625: Calling all_inventory to load vars for managed_node3
34006 1726882660.24628: Calling groups_inventory to load vars for managed_node3
34006 1726882660.24630: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882660.24642: Calling all_plugins_play to load vars for managed_node3
34006 1726882660.24653: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882660.24656: Calling groups_plugins_play to load vars for managed_node3
34006 1726882660.24691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882660.24747: done with get_vars()
34006 1726882660.24753: done getting variables
34006 1726882660.24964: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024 21:37:40 -0400 (0:00:00.026) 0:00:00.026 ******
34006 1726882660.24985: entering _queue_task() for managed_node3/gather_facts
34006 1726882660.24987: Creating lock for gather_facts
34006 1726882660.25886: worker is 1 (out of 1 available)
34006 1726882660.25897: exiting _queue_task() for managed_node3/gather_facts
34006 1726882660.25910: done queuing things up, now waiting for results queue to drain
34006 1726882660.25912: waiting for pending results...
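Editor's note: the real `/tmp/network-Kc3/inventory.yml` is not reproduced in the log, but the inventory records earlier in this run (three ungrouped hosts, each with `ansible_host` and `ansible_ssh_extra_args` host vars) imply a layout roughly like this hypothetical sketch; all values below are placeholders, not the actual file contents:

```yaml
# Hypothetical reconstruction; addresses and SSH arguments are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.1                  # placeholder
      ansible_ssh_extra_args: -o UserKnownHostsFile=/dev/null   # placeholder
    managed_node2:
      ansible_host: 203.0.113.2                  # placeholder
      ansible_ssh_extra_args: -o UserKnownHostsFile=/dev/null   # placeholder
    managed_node3:
      ansible_host: 203.0.113.3                  # placeholder
      ansible_ssh_extra_args: -o UserKnownHostsFile=/dev/null   # placeholder
```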
34006 1726882660.26571: running TaskExecutor() for managed_node3/TASK: Gathering Facts
34006 1726882660.26577: in run() - task 12673a56-9f93-11ce-7734-000000000147
34006 1726882660.26581: variable 'ansible_search_path' from source: unknown
34006 1726882660.26880: calling self._execute()
34006 1726882660.26883: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882660.26886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882660.26992: variable 'omit' from source: magic vars
34006 1726882660.27229: variable 'omit' from source: magic vars
34006 1726882660.27252: variable 'omit' from source: magic vars
34006 1726882660.27326: variable 'omit' from source: magic vars
34006 1726882660.27510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
34006 1726882660.27552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
34006 1726882660.27629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
34006 1726882660.27652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34006 1726882660.27684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34006 1726882660.27748: variable 'inventory_hostname' from source: host vars for 'managed_node3'
34006 1726882660.27805: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882660.27814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882660.28400: Set connection var ansible_pipelining to False
34006 1726882660.28403: Set connection var ansible_shell_executable to /bin/sh
34006 1726882660.28407: Set connection var ansible_timeout to 10
34006 1726882660.28409: Set connection var ansible_connection to ssh
34006 1726882660.28412: Set connection var ansible_module_compression to ZIP_DEFLATED
34006 1726882660.28414: Set connection var ansible_shell_type to sh
34006 1726882660.28417: variable 'ansible_shell_executable' from source: unknown
34006 1726882660.28419: variable 'ansible_connection' from source: unknown
34006 1726882660.28421: variable 'ansible_module_compression' from source: unknown
34006 1726882660.28423: variable 'ansible_shell_type' from source: unknown
34006 1726882660.28426: variable 'ansible_shell_executable' from source: unknown
34006 1726882660.28428: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882660.28431: variable 'ansible_pipelining' from source: unknown
34006 1726882660.28433: variable 'ansible_timeout' from source: unknown
34006 1726882660.28435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882660.28758: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
34006 1726882660.28848: variable 'omit' from source: magic vars
34006 1726882660.28859: starting attempt loop
34006 1726882660.28866: running the handler
34006 1726882660.28887: variable 'ansible_facts' from source: unknown
34006 1726882660.28913: _low_level_execute_command(): starting
34006 1726882660.28953: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
34006 1726882660.30369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34006 1726882660.30385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<<
34006 1726882660.30509: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34006 1726882660.30594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<<
34006 1726882660.30619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34006 1726882660.30750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34006 1726882660.32465: stdout chunk (state=3): >>>/root <<<
34006 1726882660.32598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34006 1726882660.32610: stdout chunk (state=3): >>><<<
34006 1726882660.32623: stderr chunk (state=3): >>><<<
34006 1726882660.32650: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34006 1726882660.32799: _low_level_execute_command(): starting
34006 1726882660.32803: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451 `" && echo ansible-tmp-1726882660.3274248-34038-64598998840451="` echo /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451 `" ) && sleep 0'
34006 1726882660.33811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34006 1726882660.33873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34006 1726882660.33888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34006 1726882660.34010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34006 1726882660.34116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
34006 1726882660.34311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34006 1726882660.36069: stdout chunk (state=3): >>>ansible-tmp-1726882660.3274248-34038-64598998840451=/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451 <<<
34006 1726882660.36176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34006 1726882660.36260: stderr chunk (state=3): >>><<<
34006 1726882660.36268: stdout chunk (state=3): >>><<<
34006 1726882660.36470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882660.3274248-34038-64598998840451=/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882660.36473: variable 'ansible_module_compression' from source: unknown 34006 1726882660.36475: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34006 1726882660.36477: ANSIBALLZ: Acquiring lock 34006 1726882660.36800: ANSIBALLZ: Lock acquired: 139800307245776 34006 1726882660.36804: ANSIBALLZ: Creating module 34006 1726882660.78097: ANSIBALLZ: Writing module into payload 34006 1726882660.78471: ANSIBALLZ: Writing module 34006 1726882660.78537: ANSIBALLZ: Renaming module 34006 1726882660.78631: ANSIBALLZ: Done creating module 34006 1726882660.78676: variable 'ansible_facts' from source: unknown 34006 1726882660.78687: variable 'inventory_hostname' from source: host vars for 'managed_node3' 34006 1726882660.78704: _low_level_execute_command(): starting 34006 1726882660.78740: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34006 1726882660.79952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882660.79965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882660.80062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882660.80146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882660.80261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882660.80285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882660.80370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882660.82217: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34006 1726882660.82335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882660.82343: stdout chunk (state=3): >>><<< 34006 1726882660.82352: stderr chunk (state=3): >>><<< 34006 1726882660.82367: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882660.82380 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34006 1726882660.82439: _low_level_execute_command(): starting 34006 1726882660.82670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34006 1726882660.82851: Sending initial data 34006 1726882660.82860: Sent initial data (1181 bytes) 34006 1726882660.83972: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34006 1726882660.83988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882660.84004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882660.84160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882660.84228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882660.84311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882660.84397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882660.88136: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 34006 1726882660.88164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882660.88272: stderr chunk (state=3): >>><<< 34006 1726882660.88275: stdout chunk (state=3): >>><<< 34006 1726882660.88343: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882660.88675: variable 'ansible_facts' from source: unknown 34006 1726882660.88678: variable 'ansible_facts' from source: unknown 34006 1726882660.88681: variable 'ansible_module_compression' from source: unknown 34006 1726882660.88683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-340060gu6lfii/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34006 1726882660.88685: variable 'ansible_facts' from source: unknown 34006 1726882660.89223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py 34006 1726882660.89502: Sending initial data 34006 1726882660.89505: Sent initial data (153 bytes) 34006 1726882660.90750: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 34006 1726882660.90844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882660.90962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882660.91200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882660.92670: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34006 1726882660.92712: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34006 1726882660.92778: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-340060gu6lfii/tmpyj6ft8nb /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py <<< 34006 1726882660.92788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py" <<< 34006 1726882660.92857: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-340060gu6lfii/tmpyj6ft8nb" to remote "/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py" <<< 34006 1726882660.95577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882660.95641: stderr chunk (state=3): >>><<< 34006 1726882660.95648: stdout chunk (state=3): >>><<< 34006 1726882660.95841: done transferring module to remote 34006 1726882660.95847: _low_level_execute_command(): starting 34006 1726882660.95850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/ /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py && sleep 0' 34006 1726882660.97045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34006 1726882660.97396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882660.97445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882660.97498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882660.99301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882660.99310: stdout chunk (state=3): >>><<< 34006 1726882660.99321: stderr chunk (state=3): >>><<< 34006 1726882660.99344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882660.99352: _low_level_execute_command(): starting 34006 1726882660.99360: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/AnsiballZ_setup.py && sleep 0' 34006 1726882661.00526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882661.00529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 34006 1726882661.00538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882661.00602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882661.00605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 
1726882661.00865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882661.00941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882661.03096: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34006 1726882661.03157: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34006 1726882661.03221: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34006 1726882661.03380: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 34006 1726882661.03399: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 34006 1726882661.03427: stdout chunk (state=3): >>>import 'codecs' # <<< 34006 1726882661.03465: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34006 1726882661.03498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424f104d0> <<< 34006 1726882661.03533: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424edfb30> <<< 34006 1726882661.03545: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3424f12a50> <<< 34006 1726882661.03623: stdout chunk (state=3): >>>import '_signal' # <<< 34006 1726882661.03648: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 34006 1726882661.03734: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34006 1726882661.03824: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34006 1726882661.03829: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 34006 1726882661.03920: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34006 1726882661.03924: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cc1130> <<< 34006 1726882661.03961: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34006 1726882661.03983: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cc1fa0> <<< 34006 1726882661.04135: stdout chunk (state=3): >>>import 'site' # <<< 34006 1726882661.04139: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34006 1726882661.04416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34006 1726882661.04446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34006 1726882661.04474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.04484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34006 1726882661.04524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34006 1726882661.04543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34006 1726882661.04575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34006 1726882661.04578: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cffdd0> <<< 34006 1726882661.04605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34006 1726882661.04688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfffe0> <<< 34006 1726882661.04715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34006 1726882661.04718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34006 1726882661.04720: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34006 1726882661.04802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.05008: stdout chunk (state=3): >>>import 'itertools' # <<< 34006 1726882661.05011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 34006 1726882661.05013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d37800> <<< 34006 1726882661.05040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d37e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d17aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d151c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfcf80> <<< 34006 1726882661.05071: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34006 1726882661.05088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34006 1726882661.05115: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34006 1726882661.05152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 
34006 1726882661.05174: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34006 1726882661.05221: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d576b0> <<< 34006 1726882661.05225: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d562d0> <<< 34006 1726882661.05255: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d16090> <<< 34006 1726882661.05275: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfee70> <<< 34006 1726882661.05312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34006 1726882661.05335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8c6e0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfc200> <<< 34006 1726882661.05499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34006 1726882661.05502: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424d8cb90> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8ca40> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424d8ce30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34006 1726882661.05535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34006 1726882661.05556: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8d520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8d1f0> import 'importlib.machinery' # <<< 34006 1726882661.05595: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34006 1726882661.05623: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8e420> <<< 34006 1726882661.05636: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 34006 1726882661.05662: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34006 1726882661.05697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34006 1726882661.05721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da4620> <<< 34006 1726882661.05736: stdout chunk (state=3): >>>import 'errno' # <<< 34006 1726882661.05766: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.05795: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da5d00> <<< 34006 1726882661.05924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da6ba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da7200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da60f0> # 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34006 1726882661.05944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34006 1726882661.05966: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.05992: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da7c50> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da73b0> <<< 34006 1726882661.06042: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8e390> <<< 34006 1726882661.06059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34006 1726882661.06130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34006 1726882661.06133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34006 1726882661.06156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424aafb60> <<< 34006 1726882661.06261: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from 
'/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34006 1726882661.06265: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad85f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8350> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad8620> <<< 34006 1726882661.06283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34006 1726882661.06352: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.06562: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad8f50> <<< 34006 1726882661.06587: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.06611: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad98e0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8800> <<< 34006 1726882661.06627: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424aadd30> <<< 34006 1726882661.06681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34006 1726882661.06801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424adac90> <<< 34006 1726882661.06804: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad9790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8eb40> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34006 1726882661.06838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.06859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34006 1726882661.06889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34006 1726882661.06913: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b06ff0> <<< 34006 1726882661.07123: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b2b380> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34006 1726882661.07185: stdout chunk (state=3): >>>import 'ntpath' # <<< 34006 1726882661.07216: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b88140> <<< 34006 1726882661.07232: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34006 1726882661.07258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34006 1726882661.07278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34006 1726882661.07324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34006 1726882661.07406: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b8a8a0> <<< 34006 1726882661.07481: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b88260> <<< 34006 1726882661.07512: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3424b55160> <<< 34006 1726882661.07549: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424999220> <<< 34006 1726882661.07572: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b2a180> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424adbbc0> <<< 34006 1726882661.07773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3424b2a780> <<< 34006 1726882661.08009: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ls7zox6x/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 34006 1726882661.08129: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.08213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34006 1726882661.08272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34006 1726882661.08305: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249faf60> <<< 34006 1726882661.08419: stdout 
chunk (state=3): >>>import '_typing' # <<< 34006 1726882661.08538: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249d9e50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249d8fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 34006 1726882661.08562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.08589: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 34006 1726882661.10027: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.11079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 34006 1726882661.11212: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249f8e30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a32960> <<< 34006 1726882661.11238: stdout chunk 
(state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a326f0> <<< 34006 1726882661.11271: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a32000> <<< 34006 1726882661.11340: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a32ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8440> import 'atexit' # <<< 34006 1726882661.11374: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882661.11443: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a33680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a338c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34006 1726882661.11467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34006 1726882661.11478: stdout chunk (state=3): >>>import '_locale' # <<< 34006 1726882661.11559: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a33d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches 
/usr/lib64/python3.12/platform.py <<< 34006 1726882661.11662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424329a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342432b710> <<< 34006 1726882661.11774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432bfb0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432d220> <<< 34006 1726882661.11785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34006 1726882661.11897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34006 1726882661.11998: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34249db050> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34006 1726882661.12019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34006 1726882661.12030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34006 1726882661.12212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424337a70> import '_tokenize' # <<< 34006 1726882661.12253: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424336540> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243362d0> <<< 34006 1726882661.12327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34006 1726882661.12351: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424336810> <<< 34006 1726882661.12441: stdout chunk (state=3): >>>import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f342432e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437bc20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437bec0> <<< 34006 1726882661.12457: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34006 1726882661.12474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34006 1726882661.12494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34006 1726882661.12530: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.12549: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437d820> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437d5e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34006 1726882661.12622: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34006 1726882661.12626: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.12638: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437fda0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437df10> <<< 34006 1726882661.12763: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34006 1726882661.12786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424383530> <<< 34006 1726882661.12892: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437fec0> <<< 34006 1726882661.12985: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424384320> <<< 34006 1726882661.13096: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension 
module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424384560> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34243848c0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437b650> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34006 1726882661.13154: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424387fe0> <<< 34006 1726882661.13310: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34242113d0> <<< 34006 1726882661.13319: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3424386780> <<< 34006 1726882661.13344: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424387b30> <<< 34006 1726882661.13364: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243863c0> # zipimport: zlib available <<< 34006 1726882661.13409: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34006 1726882661.13413: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.13484: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.13571: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.13626: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 34006 1726882661.13629: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.13707: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34006 1726882661.13745: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.13953: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.14532: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.15005: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34006 1726882661.15008: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34006 1726882661.15010: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34006 1726882661.15012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.15066: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34242155e0> <<< 34006 1726882661.15098: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34006 1726882661.15112: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424216360> <<< 34006 1726882661.15127: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242115e0> <<< 34006 1726882661.15173: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34006 1726882661.15204: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.15216: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34006 1726882661.15227: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.15443: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.15772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424216450> # zipimport: zlib 
available <<< 34006 1726882661.16102: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16490: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16569: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16644: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34006 1726882661.16650: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16701: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16723: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34006 1726882661.16746: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16907: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.16911: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 34006 1726882661.16934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34006 1726882661.17107: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34006 1726882661.17307: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.17605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424217590> # zipimport: zlib available <<< 34006 1726882661.17683: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.17756: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34006 1726882661.17760: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 
'ansible.module_utils.common.parameters' # <<< 34006 1726882661.17781: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34006 1726882661.17787: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.17833: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.17872: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34006 1726882661.17875: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.17923: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.18198: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.18201: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34006 1726882661.18226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424221ee0> <<< 34006 1726882661.18243: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342421cfe0> <<< 34006 1726882661.18274: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34006 1726882661.18280: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.18839: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342430a930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243fe600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424222060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424221ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34006 1726882661.18843: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.18911: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34006 1726882661.18914: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34006 1726882661.18947: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34006 1726882661.18957: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.18960: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.18962: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 34006 1726882661.18998: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34006 1726882661.19307: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34006 1726882661.19356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.19436: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.19508: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34006 1726882661.19708: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.19832: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.19873: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.19924: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 34006 1726882661.19931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.19956: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34006 1726882661.19969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34006 1726882661.19975: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 34006 1726882661.20016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34006 1726882661.20023: stdout chunk (state=3): 
>>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b60c0> <<< 34006 1726882661.20053: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34006 1726882661.20074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34006 1726882661.20080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 34006 1726882661.20309: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ee3f50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423ee82f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342429ea20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b6c30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b47a0> <<< 34006 1726882661.20315: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b43e0> <<< 34006 1726882661.20337: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/pool.py <<< 34006 1726882661.20398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34006 1726882661.20405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34006 1726882661.20411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34006 1726882661.20606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423eeb350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eeac00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423eeade0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eea030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34006 1726882661.20842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eeb500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/connection.py <<< 34006 1726882661.20847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f4e030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eebf50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b5880> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.20849: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 34006 1726882661.20898: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.20921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.20971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34006 1726882661.20992: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21049: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34006 1726882661.21104: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21108: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34006 1726882661.21129: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21159: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21239: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34006 1726882661.21242: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21312: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34006 1726882661.21349: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.21511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.21706: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 34006 1726882661.22116: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.22570: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34006 1726882661.22573: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.22606: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.22705: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.22724: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 34006 1726882661.22727: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 34006 1726882661.22789: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.22812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 34006 1726882661.23015: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 34006 
1726882661.23018: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23050: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34006 1726882661.23055: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23139: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 34006 1726882661.23238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34006 1726882661.23421: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f4f2f0> <<< 34006 1726882661.23424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f4edb0> import 'ansible.module_utils.facts.system.local' # <<< 34006 1726882661.23427: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23497: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34006 1726882661.23564: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23909: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 34006 1726882661.23937: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.23982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches 
/usr/lib64/python3.12/ssl.py <<< 34006 1726882661.24097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882661.24160: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f86360> <<< 34006 1726882661.24349: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f76060> <<< 34006 1726882661.24355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 34006 1726882661.24421: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.24471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34006 1726882661.24517: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.24563: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.24641: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.24906: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34006 1726882661.24951: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.24982: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34006 1726882661.25001: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25031: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 34006 
1726882661.25084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34006 1726882661.25413: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f99dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f86150> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34006 1726882661.25421: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34006 1726882661.25599: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25679: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25778: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25815: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.25998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 34006 1726882661.26206: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 34006 1726882661.26319: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.26606: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.27072: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.27715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.27784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34006 1726882661.27794: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.27901: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.27987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 34006 1726882661.28286: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.28304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34006 1726882661.28310: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.28406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 34006 1726882661.28411: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.28415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34006 1726882661.28701: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.28737: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.28814: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 34006 1726882661.29132: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34006 1726882661.29230: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34006 1726882661.29316: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29332: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 34006 1726882661.29414: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 34006 1726882661.29539: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.29919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.30109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 34006 1726882661.30175: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.30232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34006 1726882661.30391: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.30396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 34006 1726882661.30400: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 
1726882661.30430: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.30457: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34006 1726882661.30484: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.30712: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34006 1726882661.30835: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.30892: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.30954: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.31023: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 34006 1726882661.31044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 34006 1726882661.31097: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.31157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 34006 1726882661.31339: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.31528: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34006 1726882661.31546: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.31592: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.31630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34006 1726882661.31711: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available <<< 34006 1726882661.31813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882661.31915: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 34006 1726882661.31997: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.32130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34006 1726882661.32159: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882661.32811: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423d962a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423d955b0> <<< 34006 1726882661.33008: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423d8fef0> <<< 34006 1726882661.46479: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 34006 1726882661.46486: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ddc140> <<< 34006 1726882661.46642: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 34006 1726882661.46647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ddcfb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882661.46650: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 34006 1726882661.46656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423dde2d0> <<< 34006 1726882661.46771: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423dddd90> <<< 34006 1726882661.47000: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34006 1726882661.71658: stdout chunk (state=3): >>> {"ansible_facts": 
{"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.646484375, "15m": 0.400390625}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED"<<< 34006 1726882661.71742: stdout chunk (state=3): >>>: "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "41", "epoch": "1726882661", "epoch_int": "1726882661", "date": "2024-09-20", "time": "21:37:41", "iso8601_micro": "2024-09-21T01:37:41.336632Z", "iso8601": "2024-09-21T01:37:41Z", "iso8601_basic": "20240920T213741336632", "iso8601_basic_short": "20240920T213741", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2936, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 595, "free": 2936}, "nocache": {"free": 3273, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_facto<<< 34006 1726882661.71746: stdout chunk (state=3): >>>r": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 968, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261779779584, "block_size": 4096, "block_total": 65519099, "block_available": 
63911079, "block_used": 1608020, "inode_total": 131070960, "inode_available": 131028993, "inode_used": 41967, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on 
[fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "<<< 34006 1726882661.71771: stdout chunk (state=3): >>>off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": 
{"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_fips": false, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34006 1726882661.72322: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34006 1726882661.72353: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 34006 1726882661.72357: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 34006 1726882661.72371: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] 
removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections <<< 34006 1726882661.72499: stdout chunk (state=3): >>># cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 34006 1726882661.72523: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] 
removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # 
cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # 
destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34006 1726882661.72542: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing 
ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # 
cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 34006 1726882661.72573: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux <<< 34006 1726882661.72594: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy 
ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base <<< 34006 1726882661.72627: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34006 1726882661.72974: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 34006 1726882661.72977: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 <<< 34006 1726882661.73013: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34006 1726882661.73059: stdout chunk (state=3): >>># destroy ntpath <<< 34006 1726882661.73081: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 34006 1726882661.73180: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34006 1726882661.73232: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 34006 1726882661.73304: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 34006 1726882661.73314: stdout chunk (state=3): >>># destroy selectors # destroy shlex<<< 34006 1726882661.73356: stdout chunk (state=3): >>> # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 34006 1726882661.73446: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 34006 1726882661.73598: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 34006 1726882661.73602: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34006 1726882661.73820: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34006 1726882661.73823: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34006 1726882661.73826: stdout chunk (state=3): >>># destroy _collections <<< 
34006 1726882661.73846: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34006 1726882661.73873: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34006 1726882661.73905: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34006 1726882661.74135: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34006 1726882661.74160: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34006 1726882661.74470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 34006 1726882661.74505: stderr chunk (state=3): >>><<< 34006 1726882661.74585: stdout chunk (state=3): >>><<< 34006 1726882661.74910: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424f104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424edfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424f12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cc1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cc1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cffdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d37800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d37e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d17aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d151c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfcf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d576b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d562d0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d16090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8c6e0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfc200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424d8cb90> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8ca40> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424d8ce30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424cfad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8d520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8d1f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8e420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da4620> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da5d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da6ba0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da7200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da60f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424da7c50> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424da73b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8e390> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424aafb60> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad85f0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8350> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad8620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad8f50> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424ad98e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8800> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424aadd30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424adac90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad9790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424d8eb40> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b06ff0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b2b380> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b88140> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b8a8a0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b88260> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b55160> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424999220> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424b2a180> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424adbbc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3424b2a780> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ls7zox6x/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34249faf60> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249d9e50> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249d8fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34249f8e30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a32960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a326f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a32000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a32ae0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424ad8440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a33680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424a338c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424a33d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424329a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342432b710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f342432bfb0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432d220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34249db050> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3424337a70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424336540> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243362d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424336810> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342432e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437bc20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437d820> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437d5e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f342437fda0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437df10> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424383530> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437fec0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424384320> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424384560> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34243848c0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342437b650> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424387fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34242113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424386780> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424387b30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243863c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34242155e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424216360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242115e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424216450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424217590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3424221ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342421cfe0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342430a930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34243fe600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424222060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3424221ca0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b60c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ee3f50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423ee82f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f342429ea20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b6c30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b47a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b43e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423eeb350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eeac00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423eeade0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eea030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eeb500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f4e030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423eebf50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34242b5880> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f4f2f0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f4edb0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f86360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f76060> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423f99dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423f86150> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3423d962a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423d955b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423d8fef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ddc140> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423ddcfb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423dde2d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3423dddd90> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.646484375, "15m": 0.400390625}, "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "41", "epoch": "1726882661", "epoch_int": "1726882661", "date": "2024-09-20", "time": "21:37:41", "iso8601_micro": "2024-09-21T01:37:41.336632Z", "iso8601": "2024-09-21T01:37:41Z", 
"iso8601_basic": "20240920T213741336632", "iso8601_basic_short": "20240920T213741", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2936, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 595, "free": 2936}, "nocache": {"free": 3273, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", 
"sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 968, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261779779584, "block_size": 4096, "block_total": 65519099, "block_available": 63911079, "block_used": 1608020, "inode_total": 131070960, "inode_available": 131028993, "inode_used": 41967, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_fips": false, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing 
_io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib 
# cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # 
cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: (interpreter cleanup trace identical to the one printed above; duplicate omitted)
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
34006 1726882661.77550: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34006 1726882661.77553: _low_level_execute_command(): starting 34006 1726882661.77556: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882660.3274248-34038-64598998840451/ > /dev/null 2>&1 && sleep 0' 34006 1726882661.78212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882661.78362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882661.78554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882661.78602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882661.80492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882661.80497: stdout chunk (state=3): >>><<< 34006 1726882661.80500: stderr chunk (state=3): >>><<< 34006 1726882661.80723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882661.80726: handler run complete 34006 1726882661.80771: variable 'ansible_facts' from source: unknown 34006 1726882661.81008: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.81472: variable 'ansible_facts' from source: unknown 34006 1726882661.81564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.81680: attempt loop complete, returning result 34006 1726882661.81689: _execute() done 34006 1726882661.81700: dumping result to json 34006 1726882661.81735: done dumping result, returning 34006 1726882661.81747: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-11ce-7734-000000000147] 34006 1726882661.81756: sending task result for task 12673a56-9f93-11ce-7734-000000000147 34006 1726882661.82338: done sending task result for task 12673a56-9f93-11ce-7734-000000000147 34006 1726882661.82341: WORKER PROCESS EXITING ok: [managed_node3] 34006 1726882661.82832: no more pending results, returning what we have 34006 1726882661.82836: results queue empty 34006 1726882661.82836: checking for any_errors_fatal 34006 1726882661.82838: done checking for any_errors_fatal 34006 1726882661.82838: checking for max_fail_percentage 34006 1726882661.82840: done checking for max_fail_percentage 34006 1726882661.82840: checking to see if all hosts have failed and the running result is not ok 34006 1726882661.82841: done checking to see if all hosts have failed 34006 1726882661.82842: getting the remaining hosts for this loop 34006 1726882661.82844: done getting the remaining hosts for this loop 34006 1726882661.82847: getting the next task for host managed_node3 34006 1726882661.82852: done getting next task for host managed_node3 34006 1726882661.82854: ^ task is: TASK: meta (flush_handlers) 34006 1726882661.82856: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 34006 1726882661.82860: getting variables 34006 1726882661.82861: in VariableManager get_vars() 34006 1726882661.82881: Calling all_inventory to load vars for managed_node3 34006 1726882661.82884: Calling groups_inventory to load vars for managed_node3 34006 1726882661.82887: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882661.82901: Calling all_plugins_play to load vars for managed_node3 34006 1726882661.82904: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882661.82907: Calling groups_plugins_play to load vars for managed_node3 34006 1726882661.83131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.83356: done with get_vars() 34006 1726882661.83369: done getting variables 34006 1726882661.83442: in VariableManager get_vars() 34006 1726882661.83452: Calling all_inventory to load vars for managed_node3 34006 1726882661.83454: Calling groups_inventory to load vars for managed_node3 34006 1726882661.83457: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882661.83463: Calling all_plugins_play to load vars for managed_node3 34006 1726882661.83466: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882661.83468: Calling groups_plugins_play to load vars for managed_node3 34006 1726882661.84013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.84412: done with get_vars() 34006 1726882661.84425: done queuing things up, now waiting for results queue to drain 34006 1726882661.84427: results queue empty 34006 1726882661.84428: checking for any_errors_fatal 34006 1726882661.84430: done checking for any_errors_fatal 34006 1726882661.84431: checking for max_fail_percentage 34006 1726882661.84432: done checking for max_fail_percentage 34006 
1726882661.84432: checking to see if all hosts have failed and the running result is not ok 34006 1726882661.84433: done checking to see if all hosts have failed 34006 1726882661.84438: getting the remaining hosts for this loop 34006 1726882661.84439: done getting the remaining hosts for this loop 34006 1726882661.84441: getting the next task for host managed_node3 34006 1726882661.84445: done getting next task for host managed_node3 34006 1726882661.84448: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34006 1726882661.84449: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882661.84451: getting variables 34006 1726882661.84452: in VariableManager get_vars() 34006 1726882661.84460: Calling all_inventory to load vars for managed_node3 34006 1726882661.84462: Calling groups_inventory to load vars for managed_node3 34006 1726882661.84464: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882661.84469: Calling all_plugins_play to load vars for managed_node3 34006 1726882661.84471: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882661.84473: Calling groups_plugins_play to load vars for managed_node3 34006 1726882661.84938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.85176: done with get_vars() 34006 1726882661.85186: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 21:37:41 -0400 (0:00:01.602) 0:00:01.629 ****** 34006 1726882661.85285: entering _queue_task() for managed_node3/include_tasks 
34006 1726882661.85287: Creating lock for include_tasks 34006 1726882661.85575: worker is 1 (out of 1 available) 34006 1726882661.85586: exiting _queue_task() for managed_node3/include_tasks 34006 1726882661.85701: done queuing things up, now waiting for results queue to drain 34006 1726882661.85703: waiting for pending results... 34006 1726882661.86013: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 34006 1726882661.86020: in run() - task 12673a56-9f93-11ce-7734-000000000006 34006 1726882661.86023: variable 'ansible_search_path' from source: unknown 34006 1726882661.86026: calling self._execute() 34006 1726882661.86141: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882661.86145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882661.86148: variable 'omit' from source: magic vars 34006 1726882661.86178: _execute() done 34006 1726882661.86186: dumping result to json 34006 1726882661.86195: done dumping result, returning 34006 1726882661.86207: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-11ce-7734-000000000006] 34006 1726882661.86217: sending task result for task 12673a56-9f93-11ce-7734-000000000006 34006 1726882661.86391: no more pending results, returning what we have 34006 1726882661.86398: in VariableManager get_vars() 34006 1726882661.86431: Calling all_inventory to load vars for managed_node3 34006 1726882661.86435: Calling groups_inventory to load vars for managed_node3 34006 1726882661.86438: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882661.86451: Calling all_plugins_play to load vars for managed_node3 34006 1726882661.86454: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882661.86457: Calling groups_plugins_play to load vars for managed_node3 34006 1726882661.86859: done sending task result for task 12673a56-9f93-11ce-7734-000000000006 
34006 1726882661.86865: WORKER PROCESS EXITING 34006 1726882661.87015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.87510: done with get_vars() 34006 1726882661.87517: variable 'ansible_search_path' from source: unknown 34006 1726882661.87530: we have included files to process 34006 1726882661.87531: generating all_blocks data 34006 1726882661.87532: done generating all_blocks data 34006 1726882661.87533: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34006 1726882661.87534: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34006 1726882661.87536: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34006 1726882661.88789: in VariableManager get_vars() 34006 1726882661.88926: done with get_vars() 34006 1726882661.88938: done processing included file 34006 1726882661.88940: iterating over new_blocks loaded from include file 34006 1726882661.88942: in VariableManager get_vars() 34006 1726882661.88951: done with get_vars() 34006 1726882661.88953: filtering new block on tags 34006 1726882661.88968: done filtering new block on tags 34006 1726882661.88971: in VariableManager get_vars() 34006 1726882661.88981: done with get_vars() 34006 1726882661.88982: filtering new block on tags 34006 1726882661.88999: done filtering new block on tags 34006 1726882661.89002: in VariableManager get_vars() 34006 1726882661.89012: done with get_vars() 34006 1726882661.89013: filtering new block on tags 34006 1726882661.89025: done filtering new block on tags 34006 1726882661.89027: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 34006 
1726882661.89032: extending task lists for all hosts with included blocks 34006 1726882661.89076: done extending task lists 34006 1726882661.89077: done processing included files 34006 1726882661.89078: results queue empty 34006 1726882661.89079: checking for any_errors_fatal 34006 1726882661.89080: done checking for any_errors_fatal 34006 1726882661.89081: checking for max_fail_percentage 34006 1726882661.89082: done checking for max_fail_percentage 34006 1726882661.89082: checking to see if all hosts have failed and the running result is not ok 34006 1726882661.89083: done checking to see if all hosts have failed 34006 1726882661.89083: getting the remaining hosts for this loop 34006 1726882661.89085: done getting the remaining hosts for this loop 34006 1726882661.89087: getting the next task for host managed_node3 34006 1726882661.89090: done getting next task for host managed_node3 34006 1726882661.89279: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34006 1726882661.89282: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882661.89285: getting variables 34006 1726882661.89286: in VariableManager get_vars() 34006 1726882661.89296: Calling all_inventory to load vars for managed_node3 34006 1726882661.89298: Calling groups_inventory to load vars for managed_node3 34006 1726882661.89301: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882661.89305: Calling all_plugins_play to load vars for managed_node3 34006 1726882661.89308: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882661.89310: Calling groups_plugins_play to load vars for managed_node3 34006 1726882661.89469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882661.89666: done with get_vars() 34006 1726882661.89675: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:37:41 -0400 (0:00:00.044) 0:00:01.673 ****** 34006 1726882661.89744: entering _queue_task() for managed_node3/setup 34006 1726882661.90005: worker is 1 (out of 1 available) 34006 1726882661.90016: exiting _queue_task() for managed_node3/setup 34006 1726882661.90027: done queuing things up, now waiting for results queue to drain 34006 1726882661.90029: waiting for pending results... 
34006 1726882661.90254: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 34006 1726882661.90353: in run() - task 12673a56-9f93-11ce-7734-000000000158 34006 1726882661.90374: variable 'ansible_search_path' from source: unknown 34006 1726882661.90381: variable 'ansible_search_path' from source: unknown 34006 1726882661.90419: calling self._execute() 34006 1726882661.90491: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882661.90505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882661.90519: variable 'omit' from source: magic vars 34006 1726882661.91389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34006 1726882661.93583: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34006 1726882661.93637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34006 1726882661.93671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34006 1726882661.93709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34006 1726882661.93745: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34006 1726882661.93806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34006 1726882661.93827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34006 1726882661.93861: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34006 1726882661.94002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34006 1726882661.94006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34006 1726882661.94078: variable 'ansible_facts' from source: unknown 34006 1726882661.94149: variable 'network_test_required_facts' from source: task vars 34006 1726882661.94191: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34006 1726882661.94206: variable 'omit' from source: magic vars 34006 1726882661.94245: variable 'omit' from source: magic vars 34006 1726882661.94278: variable 'omit' from source: magic vars 34006 1726882661.94311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34006 1726882661.94340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34006 1726882661.94362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34006 1726882661.94382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882661.94399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882661.94431: variable 'inventory_hostname' from source: host vars for 'managed_node3' 34006 1726882661.94439: variable 'ansible_host' from source: host vars for 
'managed_node3' 34006 1726882661.94446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882661.94542: Set connection var ansible_pipelining to False 34006 1726882661.94553: Set connection var ansible_shell_executable to /bin/sh 34006 1726882661.94565: Set connection var ansible_timeout to 10 34006 1726882661.94578: Set connection var ansible_connection to ssh 34006 1726882661.94589: Set connection var ansible_module_compression to ZIP_DEFLATED 34006 1726882661.94600: Set connection var ansible_shell_type to sh 34006 1726882661.94698: variable 'ansible_shell_executable' from source: unknown 34006 1726882661.94701: variable 'ansible_connection' from source: unknown 34006 1726882661.94703: variable 'ansible_module_compression' from source: unknown 34006 1726882661.94705: variable 'ansible_shell_type' from source: unknown 34006 1726882661.94707: variable 'ansible_shell_executable' from source: unknown 34006 1726882661.94709: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882661.94710: variable 'ansible_pipelining' from source: unknown 34006 1726882661.94712: variable 'ansible_timeout' from source: unknown 34006 1726882661.94714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882661.94807: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34006 1726882661.94857: variable 'omit' from source: magic vars 34006 1726882661.94871: starting attempt loop 34006 1726882661.94877: running the handler 34006 1726882661.94880: _low_level_execute_command(): starting 34006 1726882661.94883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34006 1726882661.95551: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882661.95572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882661.95651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34006 1726882661.97843: stdout chunk (state=3): >>>/root <<< 34006 1726882661.98001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882661.98009: stdout chunk (state=3): >>><<< 34006 1726882661.98018: stderr chunk (state=3): >>><<< 34006 1726882661.98033: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34006 1726882661.98050: _low_level_execute_command(): starting 34006 1726882661.98056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563 `" && echo ansible-tmp-1726882661.9804232-34103-64351694051563="` echo /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563 `" ) && sleep 0' 34006 1726882661.98589: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882661.98616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882661.98620: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 
1726882661.98622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882661.98676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882661.98679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882661.98739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34006 1726882662.01070: stdout chunk (state=3): >>>ansible-tmp-1726882661.9804232-34103-64351694051563=/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563 <<< 34006 1726882662.01239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.01260: stderr chunk (state=3): >>><<< 34006 1726882662.01263: stdout chunk (state=3): >>><<< 34006 1726882662.01277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882661.9804232-34103-64351694051563=/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34006 1726882662.01315: variable 'ansible_module_compression' from source: unknown 34006 1726882662.01355: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-340060gu6lfii/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34006 1726882662.01397: variable 'ansible_facts' from source: unknown 34006 1726882662.01524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py 34006 1726882662.01619: Sending initial data 34006 1726882662.01622: Sent initial data (153 bytes) 34006 1726882662.02034: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882662.02037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 34006 1726882662.02039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.02042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.02044: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.02103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882662.02107: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.02145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34006 1726882662.03975: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34006 1726882662.04022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34006 1726882662.04074: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-340060gu6lfii/tmpl8opzm3l /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py <<< 34006 1726882662.04078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py" <<< 34006 1726882662.04120: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-340060gu6lfii/tmpl8opzm3l" to remote "/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py" <<< 34006 1726882662.05230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.05271: stderr chunk (state=3): >>><<< 34006 1726882662.05274: stdout chunk (state=3): >>><<< 34006 1726882662.05290: done transferring module to remote 34006 1726882662.05306: _low_level_execute_command(): starting 34006 1726882662.05308: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/ /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py && sleep 0' 34006 1726882662.05735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.05739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.05742: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.05744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.05824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.05863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.08059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.08080: stderr chunk (state=3): >>><<< 34006 1726882662.08083: stdout chunk (state=3): >>><<< 34006 1726882662.08102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882662.08105: _low_level_execute_command(): starting 34006 1726882662.08115: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/AnsiballZ_setup.py && sleep 0' 34006 1726882662.08517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882662.08522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.08524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.08526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.08575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882662.08578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 34006 1726882662.08636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.11753: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34006 1726882662.11812: stdout chunk (state=3): >>>import _imp # builtin<<< 34006 1726882662.11853: stdout chunk (state=3): >>> import '_thread' # <<< 34006 1726882662.11899: stdout chunk (state=3): >>> import '_warnings' # import '_weakref' # <<< 34006 1726882662.12286: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34006 1726882662.12294: stdout chunk (state=3): >>> import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # <<< 34006 1726882662.12298: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 34006 1726882662.12302: stdout chunk (state=3): >>> <<< 34006 1726882662.12305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 34006 1726882662.12307: stdout chunk (state=3): >>> <<< 34006 1726882662.12326: stdout chunk (state=3): >>>import '_codecs' # <<< 34006 1726882662.12416: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 34006 1726882662.12433: stdout chunk (state=3): >>> <<< 34006 1726882662.12465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34006 1726882662.12471: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21caa184d0><<< 34006 1726882662.12497: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca9e7b30><<< 34006 1726882662.12518: stdout chunk 
(state=3): >>> <<< 34006 1726882662.12642: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 34006 1726882662.12676: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21caa1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # <<< 34006 1726882662.12698: stdout chunk (state=3): >>>import 'stat' # <<< 34006 1726882662.12840: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34006 1726882662.12843: stdout chunk (state=3): >>> <<< 34006 1726882662.12902: stdout chunk (state=3): >>>import 'genericpath' # <<< 34006 1726882662.12905: stdout chunk (state=3): >>> <<< 34006 1726882662.12908: stdout chunk (state=3): >>>import 'posixpath' # <<< 34006 1726882662.12917: stdout chunk (state=3): >>> <<< 34006 1726882662.12965: stdout chunk (state=3): >>>import 'os' # <<< 34006 1726882662.12997: stdout chunk (state=3): >>> import '_sitebuiltins' # <<< 34006 1726882662.13030: stdout chunk (state=3): >>> Processing user site-packages <<< 34006 1726882662.13063: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages'<<< 34006 1726882662.13082: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 34006 1726882662.13130: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 34006 1726882662.13152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 34006 1726882662.13189: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f21ca82d130> <<< 34006 1726882662.13266: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 34006 1726882662.13297: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 34006 1726882662.13322: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca82dfa0><<< 34006 1726882662.13340: stdout chunk (state=3): >>> <<< 34006 1726882662.13364: stdout chunk (state=3): >>>import 'site' # <<< 34006 1726882662.13425: stdout chunk (state=3): >>> Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 34006 1726882662.13435: stdout chunk (state=3): >>> Type "help", "copyright", "credits" or "license" for more information.<<< 34006 1726882662.13540: stdout chunk (state=3): >>> <<< 34006 1726882662.14078: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34006 1726882662.14111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34006 1726882662.14114: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34006 1726882662.14361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 
'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca86be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca86bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34006 1726882662.14461: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.14503: stdout chunk (state=3): >>>import 'itertools' # <<< 34006 1726882662.14552: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 34006 1726882662.14555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 34006 1726882662.14558: stdout chunk (state=3): >>> <<< 34006 1726882662.14606: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 34006 1726882662.14611: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34006 1726882662.14654: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8a3ec0> <<< 34006 1726882662.14657: stdout chunk (state=3): >>>import '_collections' # <<< 34006 1726882662.14741: stdout chunk (state=3): >>> import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f21ca883b60> <<< 34006 1726882662.14753: stdout chunk (state=3): >>>import '_functools' # <<< 34006 1726882662.14804: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca881280> <<< 34006 1726882662.14946: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca869040> <<< 34006 1726882662.14989: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 34006 1726882662.15018: stdout chunk (state=3): >>> <<< 34006 1726882662.15021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc'<<< 34006 1726882662.15042: stdout chunk (state=3): >>> import '_sre' # <<< 34006 1726882662.15076: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 34006 1726882662.15117: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34006 1726882662.15149: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 34006 1726882662.15222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c3740> <<< 34006 1726882662.15246: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c2360><<< 34006 1726882662.15273: stdout chunk (state=3): >>> <<< 34006 1726882662.15297: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 34006 1726882662.15313: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca882120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c0b90><<< 34006 1726882662.15384: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34006 1726882662.15430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f87a0> <<< 34006 1726882662.15449: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8682c0> <<< 34006 1726882662.15474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 34006 1726882662.15533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.15561: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.15571: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca8f8c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f8b00><<< 34006 1726882662.15615: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.15663: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca8f8ec0> <<< 34006 1726882662.15668: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca866de0><<< 34006 1726882662.15716: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 34006 1726882662.15728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.15751: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 34006 1726882662.15805: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34006 1726882662.15840: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f9250> <<< 34006 1726882662.15859: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 34006 1726882662.15912: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34006 1726882662.15954: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fa480><<< 34006 1726882662.15977: stdout chunk (state=3): >>> import 'importlib.util' # <<< 34006 1726882662.15998: stdout chunk (state=3): >>> <<< 34006 1726882662.16028: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 34006 1726882662.16040: stdout chunk (state=3): >>> <<< 34006 1726882662.16080: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34006 1726882662.16121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 34006 1726882662.16145: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca9106b0><<< 34006 1726882662.16166: stdout chunk (state=3): >>> import 'errno' # <<< 34006 1726882662.16224: stdout chunk (state=3): >>> # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.16227: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.16358: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca911dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34006 1726882662.16382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 34006 1726882662.16421: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 34006 1726882662.16442: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 34006 1726882662.16463: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca912c30> <<< 34006 1726882662.16504: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.16536: stdout chunk (state=3): >>> # extension module '_bz2' executed from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca913260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca912180> <<< 34006 1726882662.16576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 34006 1726882662.16630: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.16663: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca913ce0> <<< 34006 1726882662.16690: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca913410><<< 34006 1726882662.16707: stdout chunk (state=3): >>> <<< 34006 1726882662.16779: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34006 1726882662.16816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 34006 1726882662.16849: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34006 1726882662.16894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 34006 1726882662.16904: stdout chunk (state=3): >>> <<< 34006 1726882662.16933: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # 
extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.16972: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca60fb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 34006 1726882662.16988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 34006 1726882662.17047: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.17053: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca6386b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca638410><<< 34006 1726882662.17114: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.17117: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca6386e0><<< 34006 1726882662.17177: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34006 1726882662.17181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34006 1726882662.17276: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.17457: stdout chunk (state=3): >>># extension module 
'_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca638fe0><<< 34006 1726882662.17657: stdout chunk (state=3): >>> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.17687: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca639970> <<< 34006 1726882662.17705: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca638890> <<< 34006 1726882662.17734: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca60dd60> <<< 34006 1726882662.17758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34006 1726882662.17796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34006 1726882662.17826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34006 1726882662.17853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34006 1726882662.17879: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca63ad20> <<< 34006 1726882662.17916: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca639850> <<< 34006 1726882662.17951: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fabd0> <<< 34006 1726882662.18051: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34006 1726882662.18082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.18112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py<<< 34006 1726882662.18147: stdout chunk (state=3): >>> <<< 34006 1726882662.18178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34006 1726882662.18216: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca667080><<< 34006 1726882662.18306: stdout chunk (state=3): >>> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34006 1726882662.18328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.18382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34006 1726882662.18421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34006 1726882662.18472: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca68b470><<< 34006 1726882662.18503: stdout chunk (state=3): >>> <<< 34006 1726882662.18515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 34006 1726882662.18569: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34006 1726882662.18658: stdout chunk (state=3): >>>import 'ntpath' # <<< 34006 
1726882662.18668: stdout chunk (state=3): >>> <<< 34006 1726882662.18723: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 34006 1726882662.18748: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6e8260> <<< 34006 1726882662.18761: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 34006 1726882662.18803: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 34006 1726882662.18848: stdout chunk (state=3): >>> <<< 34006 1726882662.18852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34006 1726882662.18905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34006 1726882662.19037: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6ea9c0><<< 34006 1726882662.19162: stdout chunk (state=3): >>> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6e8380> <<< 34006 1726882662.19213: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6b5250> <<< 34006 1726882662.19264: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 34006 1726882662.19314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca4e9340> import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f21ca68a270> <<< 34006 1726882662.19336: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca63bc50> <<< 34006 1726882662.19635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 34006 1726882662.19664: stdout chunk (state=3): >>> <<< 34006 1726882662.19678: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f21ca68a870> <<< 34006 1726882662.20026: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_op2f2sib/ansible_setup_payload.zip' <<< 34006 1726882662.20030: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.20139: stdout chunk (state=3): >>> <<< 34006 1726882662.20236: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.20297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34006 1726882662.20313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 34006 1726882662.20329: stdout chunk (state=3): >>> <<< 34006 1726882662.20360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 34006 1726882662.20365: stdout chunk (state=3): >>> <<< 34006 1726882662.20475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 34006 1726882662.20479: stdout chunk (state=3): >>> <<< 34006 1726882662.20519: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 34006 1726882662.20525: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 34006 
1726882662.20536: stdout chunk (state=3): >>> <<< 34006 1726882662.20562: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca553080> <<< 34006 1726882662.20566: stdout chunk (state=3): >>>import '_typing' # <<< 34006 1726882662.20571: stdout chunk (state=3): >>> <<< 34006 1726882662.20847: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca531f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca531100><<< 34006 1726882662.20868: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34006 1726882662.20903: stdout chunk (state=3): >>> import 'ansible' # <<< 34006 1726882662.20911: stdout chunk (state=3): >>> <<< 34006 1726882662.20928: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.20934: stdout chunk (state=3): >>> <<< 34006 1726882662.20954: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.20992: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 34006 1726882662.21015: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34006 1726882662.21021: stdout chunk (state=3): >>> <<< 34006 1726882662.23172: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.25048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca550f20><<< 34006 1726882662.25092: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 34006 1726882662.25147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34006 1726882662.25169: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34006 1726882662.25191: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 34006 1726882662.25256: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.25259: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.25287: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca582930><<< 34006 1726882662.25325: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582750> <<< 34006 1726882662.25391: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582060> <<< 34006 1726882662.25421: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34006 1726882662.25445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34006 1726882662.25532: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582a80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca553aa0> import 'atexit' # <<< 34006 1726882662.25568: stdout chunk (state=3): >>> # extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.25582: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.25640: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca5836b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.25668: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca5838f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34006 1726882662.25747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 34006 1726882662.25754: stdout chunk (state=3): >>> <<< 34006 1726882662.25765: stdout chunk (state=3): >>>import '_locale' # <<< 34006 1726882662.25833: stdout chunk (state=3): >>> import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca583e00> <<< 34006 1726882662.25868: stdout chunk (state=3): >>>import 'pwd' # <<< 34006 1726882662.25899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34006 1726882662.25942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34006 1726882662.26003: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f29b50><<< 34006 1726882662.26037: stdout chunk (state=3): >>> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.26050: stdout chunk (state=3): >>> <<< 
34006 1726882662.26062: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.26065: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f2b770><<< 34006 1726882662.26101: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34006 1726882662.26130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 34006 1726882662.26189: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2c140><<< 34006 1726882662.26200: stdout chunk (state=3): >>> <<< 34006 1726882662.26219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 34006 1726882662.26225: stdout chunk (state=3): >>> <<< 34006 1726882662.26276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 34006 1726882662.26279: stdout chunk (state=3): >>> <<< 34006 1726882662.26310: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2d010> <<< 34006 1726882662.26395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 34006 1726882662.26401: stdout chunk (state=3): >>> <<< 34006 1726882662.26429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 34006 1726882662.26440: stdout chunk (state=3): >>> <<< 34006 1726882662.26449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 
34006 1726882662.26532: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2fd70> <<< 34006 1726882662.26585: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.26623: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca533170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2e030> <<< 34006 1726882662.26657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34006 1726882662.26730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 34006 1726882662.26746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 34006 1726882662.26774: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py<<< 34006 1726882662.26782: stdout chunk (state=3): >>> <<< 34006 1726882662.26948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 34006 1726882662.26987: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34006 1726882662.26999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34006 1726882662.27037: stdout chunk (state=3): >>>import 'token' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f37b90> import '_tokenize' # <<< 34006 1726882662.27159: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f36660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f363f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34006 1726882662.27182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34006 1726882662.27347: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f36930> <<< 34006 1726882662.27353: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2e540> <<< 34006 1726882662.27408: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.27425: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.27471: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7be00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.27482: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7bfe0> <<< 34006 1726882662.27531: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34006 1726882662.27565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 34006 1726882662.27629: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.27662: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7d7c0> <<< 34006 1726882662.27736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 34006 1726882662.27755: stdout chunk (state=3): >>> <<< 34006 1726882662.27814: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.27846: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7ff50> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7e0f0><<< 34006 1726882662.27884: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34006 1726882662.27967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 34006 
1726882662.28009: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34006 1726882662.28015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 34006 1726882662.28041: stdout chunk (state=3): >>> import '_string' # <<< 34006 1726882662.28286: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f836e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7ffb0> <<< 34006 1726882662.28408: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.28460: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f847a0> <<< 34006 1726882662.28478: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f84740> <<< 34006 1726882662.28560: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.28563: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.28596: stdout chunk (state=3): >>>import 'systemd.id128' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f84a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7c0e0> <<< 34006 1726882662.28633: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 34006 1726882662.28644: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34006 1726882662.28683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34006 1726882662.28723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 34006 1726882662.28732: stdout chunk (state=3): >>> <<< 34006 1726882662.28766: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.28822: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e101a0> <<< 34006 1726882662.29100: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.29121: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e112e0><<< 34006 1726882662.29147: stdout chunk (state=3): >>> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f86960> <<< 34006 1726882662.29177: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f87ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f865a0><<< 34006 1726882662.29185: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.29208: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34006 1726882662.29239: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.29371: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.29510: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.29533: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.29538: stdout chunk (state=3): >>> <<< 34006 1726882662.29552: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 34006 1726882662.29580: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.29589: stdout chunk (state=3): >>> <<< 34006 1726882662.29611: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.29623: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 34006 1726882662.29652: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.29838: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.29843: stdout chunk (state=3): >>> <<< 34006 1726882662.30025: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.30031: stdout chunk (state=3): >>> <<< 34006 1726882662.30939: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.31800: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34006 1726882662.31813: stdout 
chunk (state=3): >>> <<< 34006 1726882662.31818: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 34006 1726882662.31843: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves.collections_abc' # <<< 34006 1726882662.31853: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 34006 1726882662.31888: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34006 1726882662.31922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.31998: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.32031: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e15490> <<< 34006 1726882662.32128: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34006 1726882662.32142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 34006 1726882662.32236: stdout chunk (state=3): >>> <<< 34006 1726882662.32254: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e162d0><<< 34006 1726882662.32263: stdout chunk (state=3): >>> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f849e0><<< 34006 1726882662.32327: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 34006 1726882662.32356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.32401: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 34006 1726882662.32423: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34006 1726882662.32449: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.32456: stdout chunk (state=3): >>> <<< 34006 1726882662.32833: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.32911: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 34006 1726882662.32942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34006 1726882662.32960: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e16ab0> <<< 34006 1726882662.32983: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.33745: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.34494: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.34500: stdout chunk (state=3): >>> <<< 34006 1726882662.34615: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.34620: stdout chunk (state=3): >>> <<< 34006 1726882662.34720: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34006 1726882662.34728: stdout chunk (state=3): >>> <<< 34006 1726882662.34746: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.34754: stdout chunk (state=3): >>> <<< 34006 1726882662.34805: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.34810: stdout chunk (state=3): >>> <<< 34006 1726882662.34857: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34006 1726882662.34883: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.35110: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 34006 1726882662.35114: stdout chunk 
(state=3): >>> <<< 34006 1726882662.35139: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35148: stdout chunk (state=3): >>> <<< 34006 1726882662.35166: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35180: stdout chunk (state=3): >>> <<< 34006 1726882662.35187: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 34006 1726882662.35212: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35219: stdout chunk (state=3): >>> <<< 34006 1726882662.35273: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35278: stdout chunk (state=3): >>> <<< 34006 1726882662.35331: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34006 1726882662.35336: stdout chunk (state=3): >>> <<< 34006 1726882662.35358: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35363: stdout chunk (state=3): >>> <<< 34006 1726882662.35743: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.35748: stdout chunk (state=3): >>> <<< 34006 1726882662.36123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 34006 1726882662.36128: stdout chunk (state=3): >>> <<< 34006 1726882662.36228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc'<<< 34006 1726882662.36233: stdout chunk (state=3): >>> <<< 34006 1726882662.36255: stdout chunk (state=3): >>>import '_ast' # <<< 34006 1726882662.36333: stdout chunk (state=3): >>> <<< 34006 1726882662.36377: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e17620><<< 34006 1726882662.36383: stdout chunk (state=3): >>> <<< 34006 1726882662.36403: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.36507: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 
1726882662.36615: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34006 1726882662.36623: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 34006 1726882662.36640: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 34006 1726882662.36659: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34006 1726882662.36733: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.36778: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34006 1726882662.36800: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.36854: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.36937: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.36985: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.37051: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34006 1726882662.37104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.37199: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.37202: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e22090> <<< 34006 1726882662.37240: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e1d880> <<< 34006 1726882662.37265: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34006 1726882662.37335: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.37403: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.37424: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.37472: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.37511: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34006 1726882662.37529: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34006 1726882662.37545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34006 1726882662.37596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34006 1726882662.37627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34006 1726882662.37639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34006 1726882662.37683: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f0a9f0> <<< 34006 1726882662.37726: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca5ae6c0> <<< 34006 1726882662.37822: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e22150> <<< 34006 1726882662.37827: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e17e90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34006 1726882662.37842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.37875: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34006 1726882662.38151: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.38184: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38235: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38295: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38367: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34006 1726882662.38504: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.38669: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34006 1726882662.38686: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.38967: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.39239: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.39297: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.39360: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.39389: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34006 1726882662.39442: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34006 1726882662.39466: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb20c0> <<< 34006 1726882662.39492: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34006 1726882662.39502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34006 1726882662.39585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34006 1726882662.39602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34006 1726882662.39625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 34006 1726882662.39636: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3c170> <<< 
34006 1726882662.39668: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.39722: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.39727: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3c740> <<< 34006 1726882662.39776: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e98920> <<< 34006 1726882662.39782: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb2c00> <<< 34006 1726882662.39846: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb07a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb0200> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34006 1726882662.39907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34006 1726882662.39938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34006 1726882662.39943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 34006 1726882662.39979: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py<<< 34006 1726882662.39983: stdout chunk (state=3): >>> <<< 34006 1726882662.39997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 34006 1726882662.40043: stdout chunk 
(state=3): >>> # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.40047: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.40064: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3f500><<< 34006 1726882662.40070: stdout chunk (state=3): >>> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3edb0> <<< 34006 1726882662.40124: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3ef90><<< 34006 1726882662.40131: stdout chunk (state=3): >>> <<< 34006 1726882662.40172: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3e1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34006 1726882662.40359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc'<<< 34006 1726882662.40375: stdout chunk (state=3): >>> import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3f6e0><<< 34006 1726882662.40415: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34006 1726882662.40464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34006 1726882662.40505: stdout chunk (state=3): 
>>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.40519: stdout chunk (state=3): >>> # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a9e1e0><<< 34006 1726882662.40535: stdout chunk (state=3): >>> <<< 34006 1726882662.40605: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9c230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb1880><<< 34006 1726882662.40616: stdout chunk (state=3): >>> <<< 34006 1726882662.40622: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 34006 1726882662.40643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 34006 1726882662.40671: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.40706: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34006 1726882662.40730: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34006 1726882662.40739: stdout chunk (state=3): >>> <<< 34006 1726882662.40812: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.40894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34006 1726882662.40910: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.40966: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34006 1726882662.41042: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.41053: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system' # <<< 34006 1726882662.41067: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41099: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34006 1726882662.41143: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41218: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34006 1726882662.41279: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41306: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34006 1726882662.41362: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41420: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41470: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41534: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.41599: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34006 1726882662.41603: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 34006 1726882662.41612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.42245: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.42928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 34006 1726882662.42991: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.43084: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.43164: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.43167: stdout chunk 
(state=3): >>> <<< 34006 1726882662.43235: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 34006 1726882662.43251: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 34006 1726882662.43315: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.43376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 34006 1726882662.43401: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882662.43495: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.43581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34006 1726882662.43624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.43686: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.43731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34006 1726882662.43768: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34006 1726882662.43860: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 34006 1726882662.43904: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.44189: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34006 1726882662.44223: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9f680><<< 34006 1726882662.44235: stdout chunk (state=3): >>> <<< 34006 1726882662.44271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 34006 1726882662.44289: stdout chunk (state=3): >>> <<< 34006 1726882662.44334: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc'<<< 34006 1726882662.44337: stdout chunk (state=3): >>> <<< 34006 1726882662.44603: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9ef30> <<< 34006 1726882662.44611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 34006 1726882662.44636: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.44695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 34006 1726882662.44882: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.44906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 34006 1726882662.44950: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.45022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 34006 1726882662.45070: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.45121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34006 1726882662.45220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34006 1726882662.45239: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.45290: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9ada4b0> <<< 34006 1726882662.45632: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f21c9aca180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 34006 1726882662.45684: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.45770: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.45879: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46022: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 34006 1726882662.46067: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 34006 1726882662.46164: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9aedee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9aeda90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 34006 1726882662.46342: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib 
available <<< 34006 1726882662.46536: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34006 1726882662.46718: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46837: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46889: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46941: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.46965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 34006 1726882662.46995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 34006 1726882662.47162: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.47186: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.47308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34006 1726882662.47328: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.47436: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.47557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 34006 1726882662.47718: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.47721: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.48180: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.48704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 34006 1726882662.48721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34006 1726882662.48801: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34006 1726882662.48905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34006 1726882662.48921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49010: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34006 1726882662.49122: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49259: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 34006 1726882662.49449: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 34006 1726882662.49495: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 34006 1726882662.49547: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49635: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49731: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.49924: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 34006 1726882662.50152: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50174: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 34006 1726882662.50250: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50273: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34006 1726882662.50354: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 34006 1726882662.50492: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.50578: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 34006 1726882662.50798: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 34006 1726882662.50922: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.51125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 34006 1726882662.51128: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.51227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 34006 1726882662.51503: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34006 1726882662.51585: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 34006 1726882662.51590: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.51757: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 34006 1726882662.51811: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.51856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34006 1726882662.51911: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.51914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.52016: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882662.52071: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34006 1726882662.52170: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52218: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 34006 1726882662.52458: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34006 1726882662.52670: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52693: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34006 1726882662.52803: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.52841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34006 1726882662.52855: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.53015: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 
'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 34006 1726882662.53111: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.53212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34006 1726882662.53222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34006 1726882662.53272: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.53889: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34006 1726882662.54012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34006 1726882662.54016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34006 1726882662.54036: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c98eb050> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c98eb4d0> <<< 34006 1726882662.54049: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c98e8bc0> <<< 34006 1726882662.54802: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, 
"type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "42", "epoch": "1726882662", "epoch_int": "1726882662", "date": "2024-09-20", "time": "21:37:42", "iso8601_micro": "2024-09-21T01:37:42.533664Z", "iso8601": "2024-09-21T01:37:42Z", "iso8601_basic": "20240920T213742533664", "iso8601_basic_short": "20240920T213742", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34006 1726882662.55367: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34006 1726882662.55442: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing 
abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections <<< 34006 1726882662.55462: stdout chunk (state=3): >>># cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression <<< 34006 1726882662.55477: stdout chunk (state=3): >>># cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize<<< 34006 1726882662.55561: stdout chunk (state=3): >>> # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # 
cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters <<< 34006 1726882662.55598: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd <<< 34006 1726882662.55669: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34006 1726882662.56013: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34006 1726882662.56065: stdout chunk (state=3): >>># destroy ntpath <<< 34006 1726882662.56126: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp <<< 34006 1726882662.56207: stdout chunk (state=3): >>># destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 34006 1726882662.56268: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # 
destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 34006 1726882662.56352: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 34006 1726882662.56428: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 34006 1726882662.56483: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib <<< 34006 1726882662.56580: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34006 1726882662.56583: stdout chunk 
(state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34006 1726882662.56730: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 34006 1726882662.56833: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34006 1726882662.56837: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34006 1726882662.56975: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34006 1726882662.57012: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 34006 1726882662.57043: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix <<< 34006 1726882662.57065: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34006 1726882662.57428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.57631: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. 
<<< 34006 1726882662.57635: stdout chunk (state=3): >>><<< 34006 1726882662.57638: stderr chunk (state=3): >>><<< 34006 1726882662.57810: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21caa184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca9e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21caa1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca82d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca82dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca86be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca86bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca883b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca881280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca869040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c3740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c2360> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca882120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8c0b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f87a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca8f8c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f8b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca8f8ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca866de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca9106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca911dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca912c30> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca913260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca912180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca913ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca913410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca60fb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca6386b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca638410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca6386e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca638fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca639970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca638890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca60dd60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca63ad20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca639850> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca8fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca667080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca68b470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6e8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6ea9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6e8380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca6b5250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca4e9340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca68a270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca63bc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f21ca68a870> # zipimport: found 103 names in '/tmp/ansible_setup_payload_op2f2sib/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f21ca553080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca531f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca531100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca550f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca582930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca582a80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca553aa0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca5836b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca5838f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca583e00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f29b50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f2b770> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2c140> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2d010> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21ca533170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2e030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f37b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f36660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f363f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f36930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f2e540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7be00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7da00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f7ff50> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7e0f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f836e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7ffb0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f847a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f84740> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f84a40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f7c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e101a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e112e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f86960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9f87ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f865a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e15490> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e162d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f849e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e16ab0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e17620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9e22090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e1d880> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9f0a9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21ca5ae6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e22150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e17e90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3c170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3c740> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9e98920> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb2c00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb07a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb0200> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3f500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3edb0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a3ef90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3e1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a3f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9a9e1e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9c230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9eb1880> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9f680> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9a9ef30> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9ada4b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9aca180> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c9aedee0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c9aeda90> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f21c98eb050> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c98eb4d0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f21c98e8bc0> {"ansible_facts": {"ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", 
"console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "37", "second": "42", "epoch": "1726882662", "epoch_int": "1726882662", "date": "2024-09-20", "time": "21:37:42", "iso8601_micro": "2024-09-21T01:37:42.533664Z", "iso8601": "2024-09-21T01:37:42Z", "iso8601_basic": "20240920T213742533664", "iso8601_basic_short": "20240920T213742", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing 
encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] 
removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv 34006 1726882662.59691: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34006 1726882662.59697: _low_level_execute_command(): starting 34006 1726882662.59700: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882661.9804232-34103-64351694051563/ > /dev/null 2>&1 && sleep 0' 34006 1726882662.59702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34006 1726882662.59705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.59707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.59759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882662.59794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882662.59845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.59958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.61926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.61929: stdout chunk (state=3): >>><<< 34006 1726882662.61931: stderr chunk (state=3): >>><<< 34006 1726882662.61946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882662.61956: handler run complete 34006 1726882662.62098: variable 'ansible_facts' from source: unknown 34006 1726882662.62101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882662.62185: variable 'ansible_facts' from source: unknown 34006 1726882662.62247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882662.62310: attempt loop complete, returning result 34006 1726882662.62324: _execute() done 34006 1726882662.62334: dumping result to json 34006 1726882662.62350: done dumping result, returning 34006 1726882662.62436: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-11ce-7734-000000000158] 34006 1726882662.62440: sending task 
result for task 12673a56-9f93-11ce-7734-000000000158 ok: [managed_node3] 34006 1726882662.62656: no more pending results, returning what we have 34006 1726882662.62659: results queue empty 34006 1726882662.62660: checking for any_errors_fatal 34006 1726882662.62661: done checking for any_errors_fatal 34006 1726882662.62662: checking for max_fail_percentage 34006 1726882662.62663: done checking for max_fail_percentage 34006 1726882662.62664: checking to see if all hosts have failed and the running result is not ok 34006 1726882662.62665: done checking to see if all hosts have failed 34006 1726882662.62666: getting the remaining hosts for this loop 34006 1726882662.62667: done getting the remaining hosts for this loop 34006 1726882662.62671: getting the next task for host managed_node3 34006 1726882662.62679: done getting next task for host managed_node3 34006 1726882662.62682: ^ task is: TASK: Check if system is ostree 34006 1726882662.62685: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882662.62690: getting variables 34006 1726882662.62692: in VariableManager get_vars() 34006 1726882662.62721: Calling all_inventory to load vars for managed_node3 34006 1726882662.62723: Calling groups_inventory to load vars for managed_node3 34006 1726882662.62726: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882662.62738: Calling all_plugins_play to load vars for managed_node3 34006 1726882662.62740: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882662.62743: Calling groups_plugins_play to load vars for managed_node3 34006 1726882662.63313: done sending task result for task 12673a56-9f93-11ce-7734-000000000158 34006 1726882662.63317: WORKER PROCESS EXITING 34006 1726882662.63370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882662.63904: done with get_vars() 34006 1726882662.63918: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:37:42 -0400 (0:00:00.743) 0:00:02.417 ****** 34006 1726882662.64115: entering _queue_task() for managed_node3/stat 34006 1726882662.64643: worker is 1 (out of 1 available) 34006 1726882662.64658: exiting _queue_task() for managed_node3/stat 34006 1726882662.64670: done queuing things up, now waiting for results queue to drain 34006 1726882662.64671: waiting for pending results... 
34006 1726882662.65336: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 34006 1726882662.65381: in run() - task 12673a56-9f93-11ce-7734-00000000015a 34006 1726882662.65418: variable 'ansible_search_path' from source: unknown 34006 1726882662.65639: variable 'ansible_search_path' from source: unknown 34006 1726882662.65643: calling self._execute() 34006 1726882662.65856: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882662.65861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882662.65864: variable 'omit' from source: magic vars 34006 1726882662.66729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34006 1726882662.66931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34006 1726882662.66990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34006 1726882662.67031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34006 1726882662.67078: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34006 1726882662.67182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34006 1726882662.67204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34006 1726882662.67223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34006 1726882662.67245: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34006 1726882662.67342: Evaluated conditional (not __network_is_ostree is defined): True 34006 1726882662.67345: variable 'omit' from source: magic vars 34006 1726882662.67372: variable 'omit' from source: magic vars 34006 1726882662.67407: variable 'omit' from source: magic vars 34006 1726882662.67425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34006 1726882662.67445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34006 1726882662.67463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34006 1726882662.67475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882662.67492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882662.67510: variable 'inventory_hostname' from source: host vars for 'managed_node3' 34006 1726882662.67513: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882662.67515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882662.67579: Set connection var ansible_pipelining to False 34006 1726882662.67583: Set connection var ansible_shell_executable to /bin/sh 34006 1726882662.67592: Set connection var ansible_timeout to 10 34006 1726882662.67596: Set connection var ansible_connection to ssh 34006 1726882662.67605: Set connection var ansible_module_compression to ZIP_DEFLATED 34006 1726882662.67608: Set connection var ansible_shell_type to sh 34006 1726882662.67621: variable 'ansible_shell_executable' from source: unknown 34006 1726882662.67624: variable 'ansible_connection' from 
source: unknown 34006 1726882662.67627: variable 'ansible_module_compression' from source: unknown 34006 1726882662.67629: variable 'ansible_shell_type' from source: unknown 34006 1726882662.67631: variable 'ansible_shell_executable' from source: unknown 34006 1726882662.67633: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882662.67636: variable 'ansible_pipelining' from source: unknown 34006 1726882662.67639: variable 'ansible_timeout' from source: unknown 34006 1726882662.67643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882662.67741: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34006 1726882662.67749: variable 'omit' from source: magic vars 34006 1726882662.67754: starting attempt loop 34006 1726882662.67756: running the handler 34006 1726882662.67767: _low_level_execute_command(): starting 34006 1726882662.67774: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34006 1726882662.68251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.68255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.68260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 34006 1726882662.68263: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.68336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882662.68377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.68424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.70460: stdout chunk (state=3): >>>/root <<< 34006 1726882662.70605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.70627: stderr chunk (state=3): >>><<< 34006 1726882662.70630: stdout chunk (state=3): >>><<< 34006 1726882662.70646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882662.70659: _low_level_execute_command(): starting 34006 1726882662.70665: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225 `" && echo ansible-tmp-1726882662.7064602-34143-201160923202225="` echo /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225 `" ) && sleep 0' 34006 1726882662.71071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.71074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.71077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.71079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.71126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882662.71130: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.71186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.73784: stdout chunk (state=3): >>>ansible-tmp-1726882662.7064602-34143-201160923202225=/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225 <<< 34006 1726882662.73962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.74012: stderr chunk (state=3): >>><<< 34006 1726882662.74016: stdout chunk (state=3): >>><<< 34006 1726882662.74024: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882662.7064602-34143-201160923202225=/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882662.74085: 
variable 'ansible_module_compression' from source: unknown 34006 1726882662.74158: ANSIBALLZ: Using lock for stat 34006 1726882662.74161: ANSIBALLZ: Acquiring lock 34006 1726882662.74163: ANSIBALLZ: Lock acquired: 139800307881200 34006 1726882662.74165: ANSIBALLZ: Creating module 34006 1726882662.82788: ANSIBALLZ: Writing module into payload 34006 1726882662.82855: ANSIBALLZ: Writing module 34006 1726882662.82873: ANSIBALLZ: Renaming module 34006 1726882662.82878: ANSIBALLZ: Done creating module 34006 1726882662.82896: variable 'ansible_facts' from source: unknown 34006 1726882662.82939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py 34006 1726882662.83039: Sending initial data 34006 1726882662.83043: Sent initial data (153 bytes) 34006 1726882662.83489: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.83497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.83502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 34006 1726882662.83504: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.83506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 34006 1726882662.83558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 1726882662.83561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882662.83567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.83630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.85957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 34006 1726882662.85960: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34006 1726882662.86010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34006 1726882662.86072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-340060gu6lfii/tmpj_hol5oo /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py <<< 34006 1726882662.86077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py" <<< 34006 1726882662.86122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-340060gu6lfii/tmpj_hol5oo" to remote "/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py" <<< 34006 1726882662.86706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.86741: stderr chunk (state=3): >>><<< 34006 1726882662.86748: stdout chunk (state=3): >>><<< 34006 1726882662.86781: done transferring module to remote 34006 1726882662.86795: _low_level_execute_command(): starting 34006 1726882662.86798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/ /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py && sleep 0' 34006 1726882662.87223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.87226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.87228: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.87230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.87302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882662.87305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.87352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882662.89753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882662.89757: stdout chunk (state=3): >>><<< 34006 1726882662.89759: stderr chunk (state=3): >>><<< 34006 1726882662.89762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882662.89764: _low_level_execute_command(): starting 34006 1726882662.89770: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/AnsiballZ_stat.py && sleep 0' 34006 1726882662.90199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.90202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.90205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 34006 1726882662.90207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882662.90208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882662.90253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting 
O_NONBLOCK <<< 34006 1726882662.90256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882662.90317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34006 1726882662.92983: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34006 1726882662.93017: stdout chunk (state=3): >>>import _imp # builtin <<< 34006 1726882662.93111: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34006 1726882662.93141: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34006 1726882662.93163: stdout chunk (state=3): >>>import 'posix' # <<< 34006 1726882662.93228: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34006 1726882662.93246: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34006 1726882662.93288: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.93309: stdout chunk (state=3): >>>import '_codecs' # <<< 34006 1726882662.93342: stdout chunk (state=3): >>>import 'codecs' # <<< 34006 1726882662.93371: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34006 1726882662.93406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f38bb00> <<< 34006 1726882662.93453: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py <<< 34006 1726882662.93483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3bea50> import '_signal' # <<< 34006 1726882662.93521: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 34006 1726882662.93524: stdout chunk (state=3): >>>import 'io' # <<< 34006 1726882662.93557: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34006 1726882662.93647: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34006 1726882662.93678: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34006 1726882662.93729: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 34006 1726882662.93756: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34006 1726882662.93772: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34006 1726882662.93812: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34006 1726882662.93830: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3cd130> <<< 34006 1726882662.93886: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34006 1726882662.93911: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3cdfa0> <<< 34006 1726882662.93936: stdout chunk (state=3): >>>import 'site' # <<< 34006 1726882662.93960: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34006 1726882662.94190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34006 1726882662.94216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34006 1726882662.94230: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.94248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34006 1726882662.94289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34006 1726882662.94302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34006 1726882662.94333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34006 1726882662.94346: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1cbe90> <<< 34006 1726882662.94386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34006 1726882662.94412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1cbf50> <<< 34006 1726882662.94429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34006 1726882662.94453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34006 1726882662.94471: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34006 1726882662.94530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882662.94542: stdout chunk (state=3): >>>import 'itertools' # <<< 34006 1726882662.94574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 34006 1726882662.94626: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f203890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34006 1726882662.94636: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f203f20> import '_collections' # <<< 34006 1726882662.94674: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e3b60> <<< 34006 1726882662.94721: stdout chunk (state=3): >>>import '_functools' # <<< 34006 1726882662.94729: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e1280> <<< 34006 1726882662.95266: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa02f1c9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f223800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f222420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f220b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f258860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1c82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f258d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f258bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f258f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1c6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f259610> <<< 34006 1726882662.95276: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2592e0> import 'importlib.machinery' # <<< 34006 1726882662.95311: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34006 1726882662.95325: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25a510> <<< 34006 1726882662.95348: stdout chunk (state=3): >>>import 'importlib.util' # <<< 34006 1726882662.95353: stdout chunk (state=3): >>>import 'runpy' # <<< 34006 1726882662.95384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34006 1726882662.95423: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34006 1726882662.95457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 34006 1726882662.95460: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f270710> <<< 34006 1726882662.95577: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f271df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34006 1726882662.95586: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f272c90> <<< 34006 1726882662.95623: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.95625: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f2732f0> <<< 34006 1726882662.95668: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2721e0> # 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34006 1726882662.95707: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.95736: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f273d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2734a0> <<< 34006 1726882662.95778: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25a540> <<< 34006 1726882662.95836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34006 1726882662.95852: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34006 1726882662.95962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02effbc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.95965: stdout 
chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f024710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f024470> <<< 34006 1726882662.95995: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f024590> <<< 34006 1726882662.96036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34006 1726882662.96038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34006 1726882662.96124: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.96305: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882662.96310: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f025010> <<< 34006 1726882662.96454: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 34006 1726882662.96540: stdout chunk (state=3): >>> # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f0259d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fa02f0248c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02eff9df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34006 1726882662.96597: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34006 1726882662.96606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34006 1726882662.96619: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f026de0> <<< 34006 1726882662.96670: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f025b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25ac30> <<< 34006 1726882662.96700: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34006 1726882662.96800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34006 1726882662.96879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f053140> <<< 34006 1726882662.96955: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 34006 1726882662.96960: stdout chunk (state=3): >>> <<< 34006 1726882662.96985: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 34006 1726882662.96990: stdout chunk (state=3): >>> <<< 34006 1726882662.97013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 34006 1726882662.97049: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34006 1726882662.97114: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0734d0><<< 34006 1726882662.97150: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 34006 1726882662.97154: stdout chunk (state=3): >>> <<< 34006 1726882662.97222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 34006 1726882662.97227: stdout chunk (state=3): >>> <<< 34006 1726882662.97307: stdout chunk (state=3): >>>import 'ntpath' # <<< 34006 1726882662.97348: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 34006 1726882662.97355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 34006 1726882662.97361: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d4260><<< 34006 1726882662.97396: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34006 1726882662.97447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34006 1726882662.97539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34006 1726882662.97675: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d69c0> <<< 34006 1726882662.97787: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d4380> <<< 34006 1726882662.97839: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f099250> <<< 34006 1726882662.97879: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9213a0> <<< 34006 1726882662.97906: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0722d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f027d40> <<< 34006 1726882662.98024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34006 1726882662.98027: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa02f072630> <<< 34006 1726882662.98222: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_oenvlos4/ansible_stat_payload.zip' # zipimport: zlib available <<< 34006 1726882662.98363: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.98395: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34006 1726882662.98433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc 
matches /usr/lib64/python3.12/typing.py <<< 34006 1726882662.98501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34006 1726882662.98540: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 34006 1726882662.98543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e976fc0> import '_typing' # <<< 34006 1726882662.98812: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e955eb0> <<< 34006 1726882662.98818: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e955070> # zipimport: zlib available <<< 34006 1726882662.98841: stdout chunk (state=3): >>>import 'ansible' # <<< 34006 1726882662.98861: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.98880: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.98891: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882662.98898: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34006 1726882662.98918: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.01117: stdout chunk (state=3): >>># zipimport: zlib available<<< 34006 1726882663.01249: stdout chunk (state=3): >>> <<< 34006 1726882663.02437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34006 1726882663.02484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e974e90> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.02490: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34006 1726882663.02517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34006 1726882663.02563: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a29c0> <<< 34006 1726882663.02597: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2750> <<< 34006 1726882663.02628: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2060> <<< 34006 1726882663.02657: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34006 1726882663.02717: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e977c50> <<< 34006 1726882663.02721: stdout chunk (state=3): >>>import 'atexit' # <<< 34006 1726882663.02755: stdout chunk (state=3): >>># extension 
module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a3680> <<< 34006 1726882663.02794: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a3860> <<< 34006 1726882663.02798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34006 1726882663.02848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34006 1726882663.02851: stdout chunk (state=3): >>>import '_locale' # <<< 34006 1726882663.02907: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a3da0> import 'pwd' # <<< 34006 1726882663.02938: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34006 1726882663.02951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34006 1726882663.02998: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e809bb0> <<< 34006 1726882663.03049: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fa02e80b7d0> <<< 34006 1726882663.03052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34006 1726882663.03061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34006 1726882663.03092: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80c1a0> <<< 34006 1726882663.03123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34006 1726882663.03150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34006 1726882663.03174: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80d0a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34006 1726882663.03230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34006 1726882663.03243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34006 1726882663.03292: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80fdd0> <<< 34006 1726882663.03326: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9570b0> <<< 34006 1726882663.03361: stdout chunk (state=3): >>>import 
'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80e090> <<< 34006 1726882663.03385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34006 1726882663.03417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34006 1726882663.03445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34006 1726882663.03484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34006 1726882663.03516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34006 1726882663.03526: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e817c50> import '_tokenize' # <<< 34006 1726882663.03597: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e816720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e816480> <<< 34006 1726882663.03627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34006 1726882663.03698: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8169f0> <<< 34006 1726882663.03743: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fa02e80e5a0> <<< 34006 1726882663.03756: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e85fef0> <<< 34006 1726882663.03790: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.03835: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e85ffb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34006 1726882663.03840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34006 1726882663.03899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34006 1726882663.03909: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e861af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8618b0> <<< 34006 1726882663.03922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py <<< 34006 1726882663.04035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34006 1726882663.04088: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e863fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8621e0> <<< 34006 1726882663.04108: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34006 1726882663.04149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.04180: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34006 1726882663.04202: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34006 1726882663.04236: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8677a0> <<< 34006 1726882663.04363: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e864260> <<< 34006 1726882663.04422: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e868650> <<< 34006 1726882663.04449: 
stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8686b0> <<< 34006 1726882663.04531: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e868b60> <<< 34006 1726882663.04539: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8601d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34006 1726882663.04598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34006 1726882663.04624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34006 1726882663.04628: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f4260> <<< 34006 1726882663.04808: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f5280> <<< 34006 1726882663.04824: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e86a9f0> <<< 34006 1726882663.04859: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e86bda0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e86a600> <<< 34006 1726882663.04897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 34006 1726882663.04901: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.04972: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.05077: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 34006 1726882663.05118: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34006 1726882663.05136: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.05246: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.05364: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.05902: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.06469: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 
34006 1726882663.06473: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 34006 1726882663.06499: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.06547: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 34006 1726882663.06570: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f9490> <<< 34006 1726882663.06626: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34006 1726882663.06644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fa240> <<< 34006 1726882663.06659: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8f5df0> <<< 34006 1726882663.06720: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 34006 1726882663.06744: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 34006 1726882663.06757: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.06895: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.07054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches 
/usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fa9f0> <<< 34006 1726882663.07070: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.07510: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.07942: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08014: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08099: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 34006 1726882663.08130: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08165: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34006 1726882663.08241: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08353: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34006 1726882663.08356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08372: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34006 1726882663.08406: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08448: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34006 1726882663.08667: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.08897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34006 1726882663.08961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34006 1726882663.08965: stdout chunk (state=3): >>>import '_ast' # <<< 34006 1726882663.09038: stdout chunk (state=3): >>>import 
'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fb5f0> <<< 34006 1726882663.09041: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09109: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09195: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 34006 1726882663.09199: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 34006 1726882663.09226: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 34006 1726882663.09256: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09296: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34006 1726882663.09309: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09344: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09383: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09445: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09506: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34006 1726882663.09547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.09621: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e705f70> <<< 34006 1726882663.09659: stdout chunk 
(state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e7032f0> <<< 34006 1726882663.09699: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34006 1726882663.09759: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09817: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09845: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.09895: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34006 1726882663.09942: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34006 1726882663.09945: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34006 1726882663.10032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34006 1726882663.10035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34006 1726882663.10059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34006 1726882663.10098: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9d6930><<< 34006 1726882663.10116: stdout chunk (state=3): >>> <<< 34006 1726882663.10135: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9ee600> <<< 34006 1726882663.10216: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e706060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8faf60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34006 1726882663.10255: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.10286: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34006 1726882663.10345: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34006 1726882663.10372: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34006 1726882663.10399: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 34006 1726882663.10514: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.10696: stdout chunk (state=3): >>># zipimport: zlib available <<< 34006 1726882663.10816: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 34006 1726882663.11127: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 34006 1726882663.11168: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] 
removing _imp # cleanup[2] removing _thread <<< 34006 1726882663.11171: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack <<< 34006 1726882663.11235: stdout chunk (state=3): >>># cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 34006 
1726882663.11240: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] 
removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap<<< 34006 1726882663.11274: stdout chunk (state=3): >>> # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy 
ansible.modules <<< 34006 1726882663.11551: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34006 1726882663.11554: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 34006 1726882663.11605: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 34006 1726882663.11638: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 34006 1726882663.11681: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 34006 1726882663.11703: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil <<< 34006 1726882663.11730: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 34006 1726882663.11764: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 34006 1726882663.11810: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc <<< 34006 1726882663.11833: stdout chunk (state=3): >>># cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 34006 1726882663.11874: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 34006 1726882663.11886: 
stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34006 1726882663.12023: stdout chunk (state=3): >>># destroy sys.monitoring <<< 34006 1726882663.12053: stdout chunk (state=3): >>># destroy _socket # destroy _collections # destroy platform # destroy _uuid <<< 34006 1726882663.12075: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34006 1726882663.12132: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34006 1726882663.12135: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34006 1726882663.12220: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34006 1726882663.12270: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator <<< 34006 1726882663.12299: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools <<< 
34006 1726882663.12309: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34006 1726882663.12707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 34006 1726882663.12711: stdout chunk (state=3): >>><<< 34006 1726882663.12713: stderr chunk (state=3): >>><<< 34006 1726882663.12838: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f38bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages 
Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f3cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1cbe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1cbf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f203890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa02f203f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1c9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f223800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f222420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1e2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f220b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f258860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1c82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f258d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f258bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f258f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f1c6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f259610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2592e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f270710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f271df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f272c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f2732f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2721e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f273d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f2734a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02effbc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f024710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f024470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f024590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f025010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02f0259d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0248c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02eff9df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f026de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f025b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f25ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa02f053140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0734d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d4260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d69c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0d4380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f099250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9213a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f0722d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02f027d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa02f072630> # zipimport: found 30 names in '/tmp/ansible_stat_payload_oenvlos4/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e976fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e955eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e955070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e974e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e977c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a3680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9a3860> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9a3da0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e809bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e80b7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80c1a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80d0a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80fdd0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e9570b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80e090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e817c50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e816720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e816480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8169f0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e80e5a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e85fef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e85ffb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e861af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8618b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e863fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8621e0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8677a0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e864260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e868650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8686b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e868b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8601d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f4260> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f5280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e86a9f0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e86bda0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e86a600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e8f9490> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fa240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8f5df0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fa9f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8fb5f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa02e705f70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e7032f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9d6930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e9ee600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e706060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa02e8faf60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace repeated verbatim; identical to the "# destroy __main__ ... # clear sys.audit hooks" sequence above] 34006 1726882663.13847: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'],
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34006 1726882663.13850: _low_level_execute_command(): starting 34006 1726882663.13853: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882662.7064602-34143-201160923202225/ > /dev/null 2>&1 && sleep 0' 34006 1726882663.14160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34006 1726882663.14172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34006 1726882663.14185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34006 1726882663.14210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34006 1726882663.14239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 34006 1726882663.14309: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34006 1726882663.14362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 34006 
1726882663.14380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34006 1726882663.14408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34006 1726882663.14491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34006 1726882663.16846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34006 1726882663.16868: stderr chunk (state=3): >>><<< 34006 1726882663.16871: stdout chunk (state=3): >>><<< 34006 1726882663.16884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34006 1726882663.16891: handler run complete 34006 1726882663.16907: attempt loop complete, returning result 34006 1726882663.16910: _execute() done 34006 1726882663.16912: dumping result to json 34006 1726882663.16917: done dumping 
result, returning 34006 1726882663.16930: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [12673a56-9f93-11ce-7734-00000000015a] 34006 1726882663.16932: sending task result for task 12673a56-9f93-11ce-7734-00000000015a 34006 1726882663.17017: done sending task result for task 12673a56-9f93-11ce-7734-00000000015a 34006 1726882663.17019: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 34006 1726882663.17095: no more pending results, returning what we have 34006 1726882663.17098: results queue empty 34006 1726882663.17099: checking for any_errors_fatal 34006 1726882663.17105: done checking for any_errors_fatal 34006 1726882663.17105: checking for max_fail_percentage 34006 1726882663.17107: done checking for max_fail_percentage 34006 1726882663.17108: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.17108: done checking to see if all hosts have failed 34006 1726882663.17109: getting the remaining hosts for this loop 34006 1726882663.17110: done getting the remaining hosts for this loop 34006 1726882663.17113: getting the next task for host managed_node3 34006 1726882663.17119: done getting next task for host managed_node3 34006 1726882663.17121: ^ task is: TASK: Set flag to indicate system is ostree 34006 1726882663.17124: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.17129: getting variables 34006 1726882663.17131: in VariableManager get_vars() 34006 1726882663.17161: Calling all_inventory to load vars for managed_node3 34006 1726882663.17163: Calling groups_inventory to load vars for managed_node3 34006 1726882663.17167: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.17176: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.17179: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.17181: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.17373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.17555: done with get_vars() 34006 1726882663.17565: done getting variables 34006 1726882663.17666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:37:43 -0400 (0:00:00.535) 0:00:02.953 ****** 34006 1726882663.17701: entering _queue_task() for managed_node3/set_fact 34006 1726882663.17703: Creating lock for set_fact 34006 1726882663.18002: worker is 1 (out of 1 available) 34006 1726882663.18015: exiting _queue_task() for managed_node3/set_fact 34006 1726882663.18027: done queuing things up, now waiting for results queue to drain 34006 1726882663.18029: waiting for pending results... 
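Editorial note: the "Check if system is ostree" result above (`"stat": {"exists": false}`) is characteristic of a `stat` probe for the ostree boot marker. A minimal sketch of such a task follows; the file path is an assumption (it is the marker ostree-based systems create at boot, but this log never prints it), and only the task name and the registered variable `__ostree_booted_stat` are attested by the log itself.

```yaml
# Hypothetical sketch; not the verbatim contents of el_repo_setup.yml.
# The /run/ostree-booted path is an assumption -- the log does not show it.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed marker path
  register: __ostree_booted_stat    # this variable name does appear later in the log
```

On a non-ostree host the marker file is absent, so `stat.exists` comes back `false`, matching the `ok: [managed_node3]` result above.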
34006 1726882663.18352: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 34006 1726882663.18358: in run() - task 12673a56-9f93-11ce-7734-00000000015b 34006 1726882663.18361: variable 'ansible_search_path' from source: unknown 34006 1726882663.18364: variable 'ansible_search_path' from source: unknown 34006 1726882663.18368: calling self._execute() 34006 1726882663.18598: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.18602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.18605: variable 'omit' from source: magic vars 34006 1726882663.19002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34006 1726882663.19227: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34006 1726882663.19269: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34006 1726882663.19297: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34006 1726882663.19324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34006 1726882663.19382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34006 1726882663.19404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34006 1726882663.19422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34006 1726882663.19441: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34006 1726882663.19527: Evaluated conditional (not __network_is_ostree is defined): True 34006 1726882663.19531: variable 'omit' from source: magic vars 34006 1726882663.19555: variable 'omit' from source: magic vars 34006 1726882663.19636: variable '__ostree_booted_stat' from source: set_fact 34006 1726882663.19675: variable 'omit' from source: magic vars 34006 1726882663.19697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34006 1726882663.19720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34006 1726882663.19735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34006 1726882663.19747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882663.19758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882663.19778: variable 'inventory_hostname' from source: host vars for 'managed_node3' 34006 1726882663.19781: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.19783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.19854: Set connection var ansible_pipelining to False 34006 1726882663.19858: Set connection var ansible_shell_executable to /bin/sh 34006 1726882663.19864: Set connection var ansible_timeout to 10 34006 1726882663.19873: Set connection var ansible_connection to ssh 34006 1726882663.19878: Set connection var ansible_module_compression to ZIP_DEFLATED 34006 1726882663.19880: Set connection var ansible_shell_type to sh 34006 1726882663.19899: variable 'ansible_shell_executable' 
from source: unknown 34006 1726882663.19902: variable 'ansible_connection' from source: unknown 34006 1726882663.19904: variable 'ansible_module_compression' from source: unknown 34006 1726882663.19907: variable 'ansible_shell_type' from source: unknown 34006 1726882663.19909: variable 'ansible_shell_executable' from source: unknown 34006 1726882663.19911: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.19913: variable 'ansible_pipelining' from source: unknown 34006 1726882663.19915: variable 'ansible_timeout' from source: unknown 34006 1726882663.19921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.19994: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34006 1726882663.20000: variable 'omit' from source: magic vars 34006 1726882663.20005: starting attempt loop 34006 1726882663.20009: running the handler 34006 1726882663.20017: handler run complete 34006 1726882663.20026: attempt loop complete, returning result 34006 1726882663.20029: _execute() done 34006 1726882663.20035: dumping result to json 34006 1726882663.20037: done dumping result, returning 34006 1726882663.20047: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [12673a56-9f93-11ce-7734-00000000015b] 34006 1726882663.20050: sending task result for task 12673a56-9f93-11ce-7734-00000000015b 34006 1726882663.20121: done sending task result for task 12673a56-9f93-11ce-7734-00000000015b 34006 1726882663.20124: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34006 1726882663.20199: no more pending results, returning what we have 34006 1726882663.20201: results 
queue empty 34006 1726882663.20202: checking for any_errors_fatal 34006 1726882663.20208: done checking for any_errors_fatal 34006 1726882663.20209: checking for max_fail_percentage 34006 1726882663.20210: done checking for max_fail_percentage 34006 1726882663.20211: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.20212: done checking to see if all hosts have failed 34006 1726882663.20213: getting the remaining hosts for this loop 34006 1726882663.20214: done getting the remaining hosts for this loop 34006 1726882663.20216: getting the next task for host managed_node3 34006 1726882663.20223: done getting next task for host managed_node3 34006 1726882663.20225: ^ task is: TASK: Fix CentOS6 Base repo 34006 1726882663.20227: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.20230: getting variables 34006 1726882663.20231: in VariableManager get_vars() 34006 1726882663.20256: Calling all_inventory to load vars for managed_node3 34006 1726882663.20259: Calling groups_inventory to load vars for managed_node3 34006 1726882663.20261: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.20269: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.20271: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.20279: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.20409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.20520: done with get_vars() 34006 1726882663.20526: done getting variables 34006 1726882663.20607: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:37:43 -0400 (0:00:00.029) 0:00:02.982 ****** 34006 1726882663.20626: entering _queue_task() for managed_node3/copy 34006 1726882663.20818: worker is 1 (out of 1 available) 34006 1726882663.20829: exiting _queue_task() for managed_node3/copy 34006 1726882663.20841: done queuing things up, now waiting for results queue to drain 34006 1726882663.20843: waiting for pending results... 
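Editorial note: the `set_fact` task whose result appears above evaluates the conditional `not __network_is_ostree is defined` (printed verbatim in the log) and reads `__ostree_booted_stat` to derive the flag. A hedged sketch consistent with those debug lines; the exact fact expression is an assumption:

```yaml
# Reconstructed from the debug output; not the verbatim task file.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    # assumed expression -- the log shows only that __ostree_booted_stat is read
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # conditional printed verbatim in the log
```

Since the stat probe returned `exists: false`, the fact resolves to `false`, which is exactly what the `ok:` result reports.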
34006 1726882663.21027: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 34006 1726882663.21122: in run() - task 12673a56-9f93-11ce-7734-00000000015d 34006 1726882663.21126: variable 'ansible_search_path' from source: unknown 34006 1726882663.21129: variable 'ansible_search_path' from source: unknown 34006 1726882663.21132: calling self._execute() 34006 1726882663.21243: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.21247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.21250: variable 'omit' from source: magic vars 34006 1726882663.21642: variable 'ansible_distribution' from source: facts 34006 1726882663.21646: Evaluated conditional (ansible_distribution == 'CentOS'): True 34006 1726882663.21744: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.21753: Evaluated conditional (ansible_distribution_major_version == '6'): False 34006 1726882663.21756: when evaluation is False, skipping this task 34006 1726882663.21759: _execute() done 34006 1726882663.21761: dumping result to json 34006 1726882663.21764: done dumping result, returning 34006 1726882663.21767: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [12673a56-9f93-11ce-7734-00000000015d] 34006 1726882663.21769: sending task result for task 12673a56-9f93-11ce-7734-00000000015d 34006 1726882663.21865: done sending task result for task 12673a56-9f93-11ce-7734-00000000015d 34006 1726882663.21867: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34006 1726882663.21960: no more pending results, returning what we have 34006 1726882663.21963: results queue empty 34006 1726882663.21964: checking for any_errors_fatal 34006 1726882663.21967: done checking for any_errors_fatal 34006 1726882663.21968: checking for 
max_fail_percentage 34006 1726882663.21970: done checking for max_fail_percentage 34006 1726882663.21971: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.21972: done checking to see if all hosts have failed 34006 1726882663.21972: getting the remaining hosts for this loop 34006 1726882663.21973: done getting the remaining hosts for this loop 34006 1726882663.21976: getting the next task for host managed_node3 34006 1726882663.21981: done getting next task for host managed_node3 34006 1726882663.21984: ^ task is: TASK: Include the task 'enable_epel.yml' 34006 1726882663.21986: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.21996: getting variables 34006 1726882663.21998: in VariableManager get_vars() 34006 1726882663.22026: Calling all_inventory to load vars for managed_node3 34006 1726882663.22028: Calling groups_inventory to load vars for managed_node3 34006 1726882663.22031: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.22043: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.22046: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.22049: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.22151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.22285: done with get_vars() 34006 1726882663.22295: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:37:43 -0400 (0:00:00.017) 0:00:02.999 ****** 34006 1726882663.22353: entering _queue_task() for managed_node3/include_tasks 34006 1726882663.22521: worker is 1 (out of 1 available) 34006 1726882663.22533: exiting _queue_task() for managed_node3/include_tasks 34006 1726882663.22543: done queuing things up, now waiting for results queue to drain 34006 1726882663.22544: waiting for pending results... 
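Editorial note: "Fix CentOS6 Base repo" is skipped because only the first of its two conditions holds: `ansible_distribution == 'CentOS'` evaluated True, but `ansible_distribution_major_version == '6'` evaluated False. A sketch of that guard pattern; the log shows only that the task uses the `copy` action, so the payload below is hypothetical:

```yaml
# Guard pattern inferred from the two "Evaluated conditional" lines above.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical target, not in the log
    content: "..."                            # elided; not visible in the log
  when:
    - ansible_distribution == 'CentOS'            # True on this host
    - ansible_distribution_major_version == '6'   # False -> task skipped
```

When any `when` condition is False, Ansible reports `skipping:` along with the `false_condition`, as seen in the result above.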
34006 1726882663.22666: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 34006 1726882663.22728: in run() - task 12673a56-9f93-11ce-7734-00000000015e 34006 1726882663.22738: variable 'ansible_search_path' from source: unknown 34006 1726882663.22741: variable 'ansible_search_path' from source: unknown 34006 1726882663.22763: calling self._execute() 34006 1726882663.22815: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.22819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.22827: variable 'omit' from source: magic vars 34006 1726882663.23132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34006 1726882663.24847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34006 1726882663.24850: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34006 1726882663.24853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34006 1726882663.24855: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34006 1726882663.24858: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34006 1726882663.24933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34006 1726882663.25001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34006 1726882663.25032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34006 1726882663.25091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34006 1726882663.25115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34006 1726882663.25236: variable '__network_is_ostree' from source: set_fact 34006 1726882663.25255: Evaluated conditional (not __network_is_ostree | d(false)): True 34006 1726882663.25277: _execute() done 34006 1726882663.25289: dumping result to json 34006 1726882663.25300: done dumping result, returning 34006 1726882663.25310: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-11ce-7734-00000000015e] 34006 1726882663.25377: sending task result for task 12673a56-9f93-11ce-7734-00000000015e 34006 1726882663.25453: done sending task result for task 12673a56-9f93-11ce-7734-00000000015e 34006 1726882663.25457: WORKER PROCESS EXITING 34006 1726882663.25524: no more pending results, returning what we have 34006 1726882663.25529: in VariableManager get_vars() 34006 1726882663.25563: Calling all_inventory to load vars for managed_node3 34006 1726882663.25567: Calling groups_inventory to load vars for managed_node3 34006 1726882663.25570: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.25584: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.25591: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.25603: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.25766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 34006 1726882663.25877: done with get_vars() 34006 1726882663.25883: variable 'ansible_search_path' from source: unknown 34006 1726882663.25883: variable 'ansible_search_path' from source: unknown 34006 1726882663.25909: we have included files to process 34006 1726882663.25910: generating all_blocks data 34006 1726882663.25912: done generating all_blocks data 34006 1726882663.25915: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34006 1726882663.25916: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34006 1726882663.25918: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34006 1726882663.26505: done processing included file 34006 1726882663.26506: iterating over new_blocks loaded from include file 34006 1726882663.26507: in VariableManager get_vars() 34006 1726882663.26514: done with get_vars() 34006 1726882663.26515: filtering new block on tags 34006 1726882663.26528: done filtering new block on tags 34006 1726882663.26530: in VariableManager get_vars() 34006 1726882663.26536: done with get_vars() 34006 1726882663.26537: filtering new block on tags 34006 1726882663.26543: done filtering new block on tags 34006 1726882663.26544: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 34006 1726882663.26548: extending task lists for all hosts with included blocks 34006 1726882663.26609: done extending task lists 34006 1726882663.26610: done processing included files 34006 1726882663.26611: results queue empty 34006 1726882663.26611: checking for any_errors_fatal 34006 1726882663.26613: done checking for any_errors_fatal 34006 1726882663.26613: checking for max_fail_percentage 34006 1726882663.26614: done 
checking for max_fail_percentage 34006 1726882663.26615: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.26615: done checking to see if all hosts have failed 34006 1726882663.26615: getting the remaining hosts for this loop 34006 1726882663.26616: done getting the remaining hosts for this loop 34006 1726882663.26618: getting the next task for host managed_node3 34006 1726882663.26620: done getting next task for host managed_node3 34006 1726882663.26622: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34006 1726882663.26623: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.26625: getting variables 34006 1726882663.26625: in VariableManager get_vars() 34006 1726882663.26631: Calling all_inventory to load vars for managed_node3 34006 1726882663.26632: Calling groups_inventory to load vars for managed_node3 34006 1726882663.26633: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.26636: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.26641: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.26643: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.26733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.26843: done with get_vars() 34006 1726882663.26849: done getting variables 34006 1726882663.26892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34006 1726882663.27017: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:37:43 -0400 (0:00:00.046) 0:00:03.046 ****** 34006 1726882663.27047: entering _queue_task() for managed_node3/command 34006 1726882663.27048: Creating lock for command 34006 1726882663.27227: worker is 1 (out of 1 available) 34006 1726882663.27238: exiting _queue_task() for managed_node3/command 34006 1726882663.27248: done queuing things up, now waiting for results queue to drain 34006 1726882663.27249: waiting for pending results... 
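Editorial note: the include at `el_repo_setup.yml:51` runs only when the host is not ostree (`not __network_is_ostree | d(false)`, printed verbatim in the log), and the included file's first task has a templated name, which is why the banner renders as `TASK [Create EPEL 10]` on this EL10 host. A hypothetical reconstruction of both pieces:

```yaml
# Sketch of el_repo_setup.yml -- the include guard is attested by the log.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)

# Sketch of enable_epel.yml -- the task name is templated, so
# "Create EPEL {{ ansible_distribution_major_version }}" renders as "Create EPEL 10".
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: "..."   # elided; the log shows only the command action
```

Note that the `when` on an `include_tasks` gates the include itself; once included, each inner task still evaluates its own conditions, as the subsequent skips show.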
34006 1726882663.27498: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 34006 1726882663.27504: in run() - task 12673a56-9f93-11ce-7734-000000000178 34006 1726882663.27699: variable 'ansible_search_path' from source: unknown 34006 1726882663.27702: variable 'ansible_search_path' from source: unknown 34006 1726882663.27705: calling self._execute() 34006 1726882663.27708: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.27711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.27713: variable 'omit' from source: magic vars 34006 1726882663.27978: variable 'ansible_distribution' from source: facts 34006 1726882663.28021: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34006 1726882663.28098: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.28129: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34006 1726882663.28133: when evaluation is False, skipping this task 34006 1726882663.28136: _execute() done 34006 1726882663.28138: dumping result to json 34006 1726882663.28140: done dumping result, returning 34006 1726882663.28143: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [12673a56-9f93-11ce-7734-000000000178] 34006 1726882663.28149: sending task result for task 12673a56-9f93-11ce-7734-000000000178 34006 1726882663.28282: done sending task result for task 12673a56-9f93-11ce-7734-000000000178 34006 1726882663.28285: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34006 1726882663.28321: no more pending results, returning what we have 34006 1726882663.28323: results queue empty 34006 1726882663.28324: checking for any_errors_fatal 34006 1726882663.28325: done checking for any_errors_fatal 34006 1726882663.28326: checking for 
max_fail_percentage 34006 1726882663.28327: done checking for max_fail_percentage 34006 1726882663.28328: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.28329: done checking to see if all hosts have failed 34006 1726882663.28329: getting the remaining hosts for this loop 34006 1726882663.28330: done getting the remaining hosts for this loop 34006 1726882663.28333: getting the next task for host managed_node3 34006 1726882663.28338: done getting next task for host managed_node3 34006 1726882663.28368: ^ task is: TASK: Install yum-utils package 34006 1726882663.28372: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.28375: getting variables 34006 1726882663.28377: in VariableManager get_vars() 34006 1726882663.28461: Calling all_inventory to load vars for managed_node3 34006 1726882663.28464: Calling groups_inventory to load vars for managed_node3 34006 1726882663.28467: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.28479: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.28481: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.28484: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.28779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.28984: done with get_vars() 34006 1726882663.28995: done getting variables 34006 1726882663.29087: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:37:43 -0400 (0:00:00.020) 0:00:03.067 ****** 34006 1726882663.29114: entering _queue_task() for managed_node3/package 34006 1726882663.29116: Creating lock for package 34006 1726882663.29351: worker is 1 (out of 1 available) 34006 1726882663.29364: exiting _queue_task() for managed_node3/package 34006 1726882663.29375: done queuing things up, now waiting for results queue to drain 34006 1726882663.29377: waiting for pending results... 
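Editorial note: "Create EPEL 10" and "Install yum-utils package" are skipped by the same two-part guard, both halves of which are printed verbatim above: distribution in `['RedHat', 'CentOS']` (True) and major version in `['7', '8']` (False on this EL10 host). The pattern, sketched with an assumed package payload:

```yaml
# Both EPEL setup tasks appear to share this guard (reconstructed, not verbatim).
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils       # assumed from the task name
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # True
    - ansible_distribution_major_version in ['7', '8']   # False -> skipped
```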
34006 1726882663.29694: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 34006 1726882663.29794: in run() - task 12673a56-9f93-11ce-7734-000000000179 34006 1726882663.29798: variable 'ansible_search_path' from source: unknown 34006 1726882663.29801: variable 'ansible_search_path' from source: unknown 34006 1726882663.29803: calling self._execute() 34006 1726882663.29939: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.29949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.29963: variable 'omit' from source: magic vars 34006 1726882663.30313: variable 'ansible_distribution' from source: facts 34006 1726882663.30336: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34006 1726882663.30470: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.30550: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34006 1726882663.30554: when evaluation is False, skipping this task 34006 1726882663.30556: _execute() done 34006 1726882663.30558: dumping result to json 34006 1726882663.30561: done dumping result, returning 34006 1726882663.30563: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [12673a56-9f93-11ce-7734-000000000179] 34006 1726882663.30565: sending task result for task 12673a56-9f93-11ce-7734-000000000179 34006 1726882663.30626: done sending task result for task 12673a56-9f93-11ce-7734-000000000179 34006 1726882663.30629: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34006 1726882663.30727: no more pending results, returning what we have 34006 1726882663.30730: results queue empty 34006 1726882663.30731: checking for any_errors_fatal 34006 1726882663.30734: done checking for any_errors_fatal 34006 
1726882663.30735: checking for max_fail_percentage 34006 1726882663.30737: done checking for max_fail_percentage 34006 1726882663.30737: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.30738: done checking to see if all hosts have failed 34006 1726882663.30739: getting the remaining hosts for this loop 34006 1726882663.30740: done getting the remaining hosts for this loop 34006 1726882663.30743: getting the next task for host managed_node3 34006 1726882663.30749: done getting next task for host managed_node3 34006 1726882663.30751: ^ task is: TASK: Enable EPEL 7 34006 1726882663.30754: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.30757: getting variables 34006 1726882663.30759: in VariableManager get_vars() 34006 1726882663.30782: Calling all_inventory to load vars for managed_node3 34006 1726882663.30784: Calling groups_inventory to load vars for managed_node3 34006 1726882663.30787: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.30912: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.30915: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.30919: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.31141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.31351: done with get_vars() 34006 1726882663.31359: done getting variables 34006 1726882663.31416: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:37:43 -0400 (0:00:00.023) 0:00:03.090 ****** 34006 1726882663.31446: entering _queue_task() for managed_node3/command 34006 1726882663.31662: worker is 1 (out of 1 available) 34006 1726882663.31674: exiting _queue_task() for managed_node3/command 34006 1726882663.31684: done queuing things up, now waiting for results queue to drain 34006 1726882663.31686: waiting for pending results... 
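The "Install yum-utils package" and "Enable EPEL 7" skips recorded above come from `when` guards in enable_epel.yml. A hedged reconstruction of such a task follows; only the module name (`package`), the task name, and the two evaluated conditionals are taken from the log, while the task body is an assumption:

```yaml
# Sketch of the guarded task behind the skip above (body assumed,
# conditionals copied from the "Evaluated conditional" log records).
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Because the second conditional evaluates False on this host, the task is skipped with `skip_reason: Conditional result was False`, exactly as the JSON result above shows.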
34006 1726882663.32108: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 34006 1726882663.32113: in run() - task 12673a56-9f93-11ce-7734-00000000017a 34006 1726882663.32116: variable 'ansible_search_path' from source: unknown 34006 1726882663.32119: variable 'ansible_search_path' from source: unknown 34006 1726882663.32121: calling self._execute() 34006 1726882663.32140: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.32151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.32165: variable 'omit' from source: magic vars 34006 1726882663.32528: variable 'ansible_distribution' from source: facts 34006 1726882663.32546: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34006 1726882663.32672: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.32683: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34006 1726882663.32696: when evaluation is False, skipping this task 34006 1726882663.32705: _execute() done 34006 1726882663.32715: dumping result to json 34006 1726882663.32723: done dumping result, returning 34006 1726882663.32733: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [12673a56-9f93-11ce-7734-00000000017a] 34006 1726882663.32742: sending task result for task 12673a56-9f93-11ce-7734-00000000017a 34006 1726882663.33018: done sending task result for task 12673a56-9f93-11ce-7734-00000000017a 34006 1726882663.33021: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34006 1726882663.33055: no more pending results, returning what we have 34006 1726882663.33058: results queue empty 34006 1726882663.33059: checking for any_errors_fatal 34006 1726882663.33064: done checking for any_errors_fatal 34006 1726882663.33064: checking for 
max_fail_percentage 34006 1726882663.33066: done checking for max_fail_percentage 34006 1726882663.33067: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.33068: done checking to see if all hosts have failed 34006 1726882663.33068: getting the remaining hosts for this loop 34006 1726882663.33069: done getting the remaining hosts for this loop 34006 1726882663.33072: getting the next task for host managed_node3 34006 1726882663.33078: done getting next task for host managed_node3 34006 1726882663.33081: ^ task is: TASK: Enable EPEL 8 34006 1726882663.33084: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.33090: getting variables 34006 1726882663.33092: in VariableManager get_vars() 34006 1726882663.33115: Calling all_inventory to load vars for managed_node3 34006 1726882663.33118: Calling groups_inventory to load vars for managed_node3 34006 1726882663.33121: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.33128: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.33131: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.33134: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.33310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.33518: done with get_vars() 34006 1726882663.33525: done getting variables 34006 1726882663.33568: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:37:43 -0400 (0:00:00.021) 0:00:03.112 ****** 34006 1726882663.33595: entering _queue_task() for managed_node3/command 34006 1726882663.33801: worker is 1 (out of 1 available) 34006 1726882663.33811: exiting _queue_task() for managed_node3/command 34006 1726882663.33821: done queuing things up, now waiting for results queue to drain 34006 1726882663.33823: waiting for pending results... 
34006 1726882663.34045: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 34006 1726882663.34153: in run() - task 12673a56-9f93-11ce-7734-00000000017b 34006 1726882663.34174: variable 'ansible_search_path' from source: unknown 34006 1726882663.34181: variable 'ansible_search_path' from source: unknown 34006 1726882663.34220: calling self._execute() 34006 1726882663.34300: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.34311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.34325: variable 'omit' from source: magic vars 34006 1726882663.34675: variable 'ansible_distribution' from source: facts 34006 1726882663.34696: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34006 1726882663.34832: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.34842: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34006 1726882663.34849: when evaluation is False, skipping this task 34006 1726882663.34856: _execute() done 34006 1726882663.34863: dumping result to json 34006 1726882663.34923: done dumping result, returning 34006 1726882663.34927: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [12673a56-9f93-11ce-7734-00000000017b] 34006 1726882663.34929: sending task result for task 12673a56-9f93-11ce-7734-00000000017b 34006 1726882663.34996: done sending task result for task 12673a56-9f93-11ce-7734-00000000017b 34006 1726882663.35000: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34006 1726882663.35072: no more pending results, returning what we have 34006 1726882663.35075: results queue empty 34006 1726882663.35076: checking for any_errors_fatal 34006 1726882663.35083: done checking for any_errors_fatal 34006 1726882663.35084: checking for 
max_fail_percentage 34006 1726882663.35086: done checking for max_fail_percentage 34006 1726882663.35087: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.35090: done checking to see if all hosts have failed 34006 1726882663.35091: getting the remaining hosts for this loop 34006 1726882663.35092: done getting the remaining hosts for this loop 34006 1726882663.35097: getting the next task for host managed_node3 34006 1726882663.35106: done getting next task for host managed_node3 34006 1726882663.35109: ^ task is: TASK: Enable EPEL 6 34006 1726882663.35113: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.35117: getting variables 34006 1726882663.35118: in VariableManager get_vars() 34006 1726882663.35146: Calling all_inventory to load vars for managed_node3 34006 1726882663.35148: Calling groups_inventory to load vars for managed_node3 34006 1726882663.35152: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.35163: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.35166: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.35168: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.35469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.35667: done with get_vars() 34006 1726882663.35676: done getting variables 34006 1726882663.35731: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:37:43 -0400 (0:00:00.021) 0:00:03.133 ****** 34006 1726882663.35756: entering _queue_task() for managed_node3/copy 34006 1726882663.35956: worker is 1 (out of 1 available) 34006 1726882663.35965: exiting _queue_task() for managed_node3/copy 34006 1726882663.35975: done queuing things up, now waiting for results queue to drain 34006 1726882663.35976: waiting for pending results... 
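The repeated "Evaluated conditional" records can be mimicked with a minimal sketch. Here `eval()` stands in for Ansible's Jinja2 templar, and the fact values are assumptions chosen only to reproduce the True/False outcomes the log reports (distribution check True, major-version check False):

```python
# Minimal sketch of the `when` evaluation visible in the log records above.
facts = {
    "ansible_distribution": "CentOS",           # assumed; log only shows membership in ['RedHat', 'CentOS']
    "ansible_distribution_major_version": "9",  # assumed; log only shows it is not '7' or '8'
}

def evaluate_when(condition: str, variables: dict) -> bool:
    """Evaluate one conditional expression against the gathered facts."""
    return bool(eval(condition, {"__builtins__": {}}, dict(variables)))

conditions = [
    "ansible_distribution in ['RedHat', 'CentOS']",      # log: True
    "ansible_distribution_major_version in ['7', '8']",  # log: False -> task skipped
]
results = [evaluate_when(c, facts) for c in conditions]
print(results)  # [True, False]
```

When any clause in the list is False, the executor short-circuits to "when evaluation is False, skipping this task", which is the path every EPEL task takes in this run.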
34006 1726882663.36209: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 34006 1726882663.36400: in run() - task 12673a56-9f93-11ce-7734-00000000017d 34006 1726882663.36403: variable 'ansible_search_path' from source: unknown 34006 1726882663.36406: variable 'ansible_search_path' from source: unknown 34006 1726882663.36408: calling self._execute() 34006 1726882663.36428: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.36438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.36452: variable 'omit' from source: magic vars 34006 1726882663.36796: variable 'ansible_distribution' from source: facts 34006 1726882663.36812: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34006 1726882663.36931: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.36942: Evaluated conditional (ansible_distribution_major_version == '6'): False 34006 1726882663.36952: when evaluation is False, skipping this task 34006 1726882663.36961: _execute() done 34006 1726882663.36968: dumping result to json 34006 1726882663.36997: done dumping result, returning 34006 1726882663.37001: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [12673a56-9f93-11ce-7734-00000000017d] 34006 1726882663.37004: sending task result for task 12673a56-9f93-11ce-7734-00000000017d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34006 1726882663.37306: no more pending results, returning what we have 34006 1726882663.37309: results queue empty 34006 1726882663.37310: checking for any_errors_fatal 34006 1726882663.37314: done checking for any_errors_fatal 34006 1726882663.37315: checking for max_fail_percentage 34006 1726882663.37316: done checking for max_fail_percentage 34006 1726882663.37317: checking to see if all hosts have failed and the running 
result is not ok 34006 1726882663.37318: done checking to see if all hosts have failed 34006 1726882663.37318: getting the remaining hosts for this loop 34006 1726882663.37319: done getting the remaining hosts for this loop 34006 1726882663.37323: getting the next task for host managed_node3 34006 1726882663.37330: done getting next task for host managed_node3 34006 1726882663.37333: ^ task is: TASK: Set network provider to 'nm' 34006 1726882663.37335: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882663.37338: getting variables 34006 1726882663.37339: in VariableManager get_vars() 34006 1726882663.37362: Calling all_inventory to load vars for managed_node3 34006 1726882663.37364: Calling groups_inventory to load vars for managed_node3 34006 1726882663.37367: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.37375: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.37377: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.37380: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.37564: done sending task result for task 12673a56-9f93-11ce-7734-00000000017d 34006 1726882663.37567: WORKER PROCESS EXITING 34006 1726882663.37590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.37780: done with get_vars() 34006 1726882663.37791: done getting variables 34006 1726882663.37844: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 21:37:43 -0400 (0:00:00.021) 0:00:03.154 ****** 34006 1726882663.37868: entering _queue_task() for managed_node3/set_fact 34006 1726882663.38073: worker is 1 (out of 1 available) 34006 1726882663.38085: exiting _queue_task() for managed_node3/set_fact 34006 1726882663.38301: done queuing things up, now waiting for results queue to drain 34006 1726882663.38303: waiting for pending results... 34006 1726882663.38324: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 34006 1726882663.38410: in run() - task 12673a56-9f93-11ce-7734-000000000007 34006 1726882663.38433: variable 'ansible_search_path' from source: unknown 34006 1726882663.38470: calling self._execute() 34006 1726882663.38553: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.38564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.38579: variable 'omit' from source: magic vars 34006 1726882663.38686: variable 'omit' from source: magic vars 34006 1726882663.38723: variable 'omit' from source: magic vars 34006 1726882663.38768: variable 'omit' from source: magic vars 34006 1726882663.38813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34006 1726882663.38857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34006 1726882663.38884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34006 1726882663.38911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882663.38926: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34006 1726882663.38965: variable 'inventory_hostname' from source: host vars for 'managed_node3' 34006 1726882663.39075: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.39079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.39097: Set connection var ansible_pipelining to False 34006 1726882663.39110: Set connection var ansible_shell_executable to /bin/sh 34006 1726882663.39122: Set connection var ansible_timeout to 10 34006 1726882663.39134: Set connection var ansible_connection to ssh 34006 1726882663.39144: Set connection var ansible_module_compression to ZIP_DEFLATED 34006 1726882663.39150: Set connection var ansible_shell_type to sh 34006 1726882663.39175: variable 'ansible_shell_executable' from source: unknown 34006 1726882663.39192: variable 'ansible_connection' from source: unknown 34006 1726882663.39204: variable 'ansible_module_compression' from source: unknown 34006 1726882663.39212: variable 'ansible_shell_type' from source: unknown 34006 1726882663.39220: variable 'ansible_shell_executable' from source: unknown 34006 1726882663.39228: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.39235: variable 'ansible_pipelining' from source: unknown 34006 1726882663.39242: variable 'ansible_timeout' from source: unknown 34006 1726882663.39251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.39387: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34006 1726882663.39411: variable 'omit' from source: magic vars 34006 1726882663.39422: starting 
attempt loop 34006 1726882663.39508: running the handler 34006 1726882663.39511: handler run complete 34006 1726882663.39514: attempt loop complete, returning result 34006 1726882663.39516: _execute() done 34006 1726882663.39518: dumping result to json 34006 1726882663.39520: done dumping result, returning 34006 1726882663.39522: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [12673a56-9f93-11ce-7734-000000000007] 34006 1726882663.39524: sending task result for task 12673a56-9f93-11ce-7734-000000000007 34006 1726882663.39581: done sending task result for task 12673a56-9f93-11ce-7734-000000000007 34006 1726882663.39585: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34006 1726882663.39660: no more pending results, returning what we have 34006 1726882663.39662: results queue empty 34006 1726882663.39663: checking for any_errors_fatal 34006 1726882663.39670: done checking for any_errors_fatal 34006 1726882663.39671: checking for max_fail_percentage 34006 1726882663.39672: done checking for max_fail_percentage 34006 1726882663.39673: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.39674: done checking to see if all hosts have failed 34006 1726882663.39674: getting the remaining hosts for this loop 34006 1726882663.39676: done getting the remaining hosts for this loop 34006 1726882663.39679: getting the next task for host managed_node3 34006 1726882663.39685: done getting next task for host managed_node3 34006 1726882663.39690: ^ task is: TASK: meta (flush_handlers) 34006 1726882663.39692: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.39698: getting variables 34006 1726882663.39700: in VariableManager get_vars() 34006 1726882663.39727: Calling all_inventory to load vars for managed_node3 34006 1726882663.39730: Calling groups_inventory to load vars for managed_node3 34006 1726882663.39733: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.39744: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.39747: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.39750: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.40016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.40247: done with get_vars() 34006 1726882663.40256: done getting variables 34006 1726882663.40317: in VariableManager get_vars() 34006 1726882663.40326: Calling all_inventory to load vars for managed_node3 34006 1726882663.40328: Calling groups_inventory to load vars for managed_node3 34006 1726882663.40330: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.40334: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.40336: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.40339: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.40473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.40683: done with get_vars() 34006 1726882663.40699: done queuing things up, now waiting for results queue to drain 34006 1726882663.40701: results queue empty 34006 1726882663.40702: checking for any_errors_fatal 34006 1726882663.40704: done checking for any_errors_fatal 34006 1726882663.40705: checking for max_fail_percentage 34006 1726882663.40706: done checking for max_fail_percentage 34006 1726882663.40706: checking to see if all hosts have failed and the running result is not 
ok 34006 1726882663.40707: done checking to see if all hosts have failed 34006 1726882663.40708: getting the remaining hosts for this loop 34006 1726882663.40709: done getting the remaining hosts for this loop 34006 1726882663.40711: getting the next task for host managed_node3 34006 1726882663.40714: done getting next task for host managed_node3 34006 1726882663.40715: ^ task is: TASK: meta (flush_handlers) 34006 1726882663.40716: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882663.40723: getting variables 34006 1726882663.40724: in VariableManager get_vars() 34006 1726882663.40731: Calling all_inventory to load vars for managed_node3 34006 1726882663.40733: Calling groups_inventory to load vars for managed_node3 34006 1726882663.40735: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.40739: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.40742: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.40744: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.40878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.41064: done with get_vars() 34006 1726882663.41071: done getting variables 34006 1726882663.41117: in VariableManager get_vars() 34006 1726882663.41125: Calling all_inventory to load vars for managed_node3 34006 1726882663.41127: Calling groups_inventory to load vars for managed_node3 34006 1726882663.41129: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.41133: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.41136: Calling groups_plugins_inventory to load vars for 
managed_node3 34006 1726882663.41138: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.41273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.41479: done with get_vars() 34006 1726882663.41492: done queuing things up, now waiting for results queue to drain 34006 1726882663.41496: results queue empty 34006 1726882663.41497: checking for any_errors_fatal 34006 1726882663.41498: done checking for any_errors_fatal 34006 1726882663.41498: checking for max_fail_percentage 34006 1726882663.41499: done checking for max_fail_percentage 34006 1726882663.41500: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.41500: done checking to see if all hosts have failed 34006 1726882663.41501: getting the remaining hosts for this loop 34006 1726882663.41502: done getting the remaining hosts for this loop 34006 1726882663.41504: getting the next task for host managed_node3 34006 1726882663.41507: done getting next task for host managed_node3 34006 1726882663.41508: ^ task is: None 34006 1726882663.41509: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.41510: done queuing things up, now waiting for results queue to drain 34006 1726882663.41511: results queue empty 34006 1726882663.41511: checking for any_errors_fatal 34006 1726882663.41512: done checking for any_errors_fatal 34006 1726882663.41512: checking for max_fail_percentage 34006 1726882663.41513: done checking for max_fail_percentage 34006 1726882663.41514: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.41515: done checking to see if all hosts have failed 34006 1726882663.41516: getting the next task for host managed_node3 34006 1726882663.41518: done getting next task for host managed_node3 34006 1726882663.41519: ^ task is: None 34006 1726882663.41520: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.41556: in VariableManager get_vars() 34006 1726882663.41585: done with get_vars() 34006 1726882663.41603: in VariableManager get_vars() 34006 1726882663.41624: done with get_vars() 34006 1726882663.41629: variable 'omit' from source: magic vars 34006 1726882663.41658: in VariableManager get_vars() 34006 1726882663.41676: done with get_vars() 34006 1726882663.41703: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 34006 1726882663.42419: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34006 1726882663.42442: getting the remaining hosts for this loop 34006 1726882663.42443: done getting the remaining hosts for this loop 34006 1726882663.42445: getting the next task for host managed_node3 34006 1726882663.42448: done getting next task for host managed_node3 34006 1726882663.42449: ^ task is: TASK: Gathering Facts 34006 1726882663.42450: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.42452: getting variables 34006 1726882663.42453: in VariableManager get_vars() 34006 1726882663.42467: Calling all_inventory to load vars for managed_node3 34006 1726882663.42469: Calling groups_inventory to load vars for managed_node3 34006 1726882663.42471: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.42476: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.42490: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.42496: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.42630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.42821: done with get_vars() 34006 1726882663.42828: done getting variables 34006 1726882663.42863: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 21:37:43 -0400 (0:00:00.050) 0:00:03.205 ****** 34006 1726882663.42883: entering _queue_task() for managed_node3/gather_facts 34006 1726882663.43098: worker is 1 (out of 1 available) 34006 1726882663.43109: exiting _queue_task() for managed_node3/gather_facts 34006 1726882663.43119: done queuing things up, now waiting for results queue to drain 34006 1726882663.43120: waiting for pending results... 
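The "Set network provider to 'nm'" result recorded above (`ok` with `ansible_facts: {"network_provider": "nm"}`, task path tests_wireless_nm.yml:13) corresponds to a `set_fact` task along these lines; the exact YAML wording is an assumption, but the fact name and value are taken from the logged result:

```yaml
# Sketch of the set_fact task whose "ok" result appears in the log above.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

Unlike the EPEL tasks, this one carries no failing `when` guard, so the handler runs locally (no module is shipped to the host) and completes in a single attempt, as the "running the handler" / "handler run complete" records show.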
34006 1726882663.43508: running TaskExecutor() for managed_node3/TASK: Gathering Facts 34006 1726882663.43513: in run() - task 12673a56-9f93-11ce-7734-0000000001a3 34006 1726882663.43516: variable 'ansible_search_path' from source: unknown 34006 1726882663.43519: calling self._execute() 34006 1726882663.43556: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.43567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.43580: variable 'omit' from source: magic vars 34006 1726882663.43916: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.43936: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.44057: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.44067: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.44074: when evaluation is False, skipping this task 34006 1726882663.44080: _execute() done 34006 1726882663.44087: dumping result to json 34006 1726882663.44099: done dumping result, returning 34006 1726882663.44109: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-11ce-7734-0000000001a3] 34006 1726882663.44119: sending task result for task 12673a56-9f93-11ce-7734-0000000001a3 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.44241: no more pending results, returning what we have 34006 1726882663.44244: results queue empty 34006 1726882663.44245: checking for any_errors_fatal 34006 1726882663.44247: done checking for any_errors_fatal 34006 1726882663.44247: checking for max_fail_percentage 34006 1726882663.44249: done checking for max_fail_percentage 34006 1726882663.44250: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.44251: done checking to see 
if all hosts have failed 34006 1726882663.44251: getting the remaining hosts for this loop 34006 1726882663.44253: done getting the remaining hosts for this loop 34006 1726882663.44256: getting the next task for host managed_node3 34006 1726882663.44262: done getting next task for host managed_node3 34006 1726882663.44264: ^ task is: TASK: meta (flush_handlers) 34006 1726882663.44266: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882663.44270: getting variables 34006 1726882663.44271: in VariableManager get_vars() 34006 1726882663.44320: Calling all_inventory to load vars for managed_node3 34006 1726882663.44323: Calling groups_inventory to load vars for managed_node3 34006 1726882663.44325: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.44336: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.44339: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.44343: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.44744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.44929: done with get_vars() 34006 1726882663.44937: done getting variables 34006 1726882663.44960: done sending task result for task 12673a56-9f93-11ce-7734-0000000001a3 34006 1726882663.44963: WORKER PROCESS EXITING 34006 1726882663.45001: in VariableManager get_vars() 34006 1726882663.45015: Calling all_inventory to load vars for managed_node3 34006 1726882663.45017: Calling groups_inventory to load vars for managed_node3 34006 1726882663.45018: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.45022: Calling all_plugins_play to load vars 
for managed_node3 34006 1726882663.45024: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.45026: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.45240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.45462: done with get_vars() 34006 1726882663.45474: done queuing things up, now waiting for results queue to drain 34006 1726882663.45475: results queue empty 34006 1726882663.45476: checking for any_errors_fatal 34006 1726882663.45478: done checking for any_errors_fatal 34006 1726882663.45479: checking for max_fail_percentage 34006 1726882663.45479: done checking for max_fail_percentage 34006 1726882663.45480: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.45481: done checking to see if all hosts have failed 34006 1726882663.45481: getting the remaining hosts for this loop 34006 1726882663.45482: done getting the remaining hosts for this loop 34006 1726882663.45484: getting the next task for host managed_node3 34006 1726882663.45487: done getting next task for host managed_node3 34006 1726882663.45489: ^ task is: TASK: INIT: wireless tests 34006 1726882663.45490: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.45492: getting variables 34006 1726882663.45495: in VariableManager get_vars() 34006 1726882663.45509: Calling all_inventory to load vars for managed_node3 34006 1726882663.45511: Calling groups_inventory to load vars for managed_node3 34006 1726882663.45514: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.45518: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.45520: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.45523: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.45711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.45930: done with get_vars() 34006 1726882663.45938: done getting variables 34006 1726882663.46314: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 21:37:43 -0400 (0:00:00.034) 0:00:03.239 ****** 34006 1726882663.46336: entering _queue_task() for managed_node3/debug 34006 1726882663.46338: Creating lock for debug 34006 1726882663.46572: worker is 1 (out of 1 available) 34006 1726882663.46582: exiting _queue_task() for managed_node3/debug 34006 1726882663.46596: done queuing things up, now waiting for results queue to drain 34006 1726882663.46598: waiting for pending results... 
34006 1726882663.46933: running TaskExecutor() for managed_node3/TASK: INIT: wireless tests 34006 1726882663.47022: in run() - task 12673a56-9f93-11ce-7734-00000000000b 34006 1726882663.47040: variable 'ansible_search_path' from source: unknown 34006 1726882663.47088: calling self._execute() 34006 1726882663.47169: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.47187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.47267: variable 'omit' from source: magic vars 34006 1726882663.47574: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.47595: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.47726: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.47737: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.47744: when evaluation is False, skipping this task 34006 1726882663.47751: _execute() done 34006 1726882663.47758: dumping result to json 34006 1726882663.47764: done dumping result, returning 34006 1726882663.47774: done running TaskExecutor() for managed_node3/TASK: INIT: wireless tests [12673a56-9f93-11ce-7734-00000000000b] 34006 1726882663.47782: sending task result for task 12673a56-9f93-11ce-7734-00000000000b skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882663.47992: no more pending results, returning what we have 34006 1726882663.47998: results queue empty 34006 1726882663.47999: checking for any_errors_fatal 34006 1726882663.48001: done checking for any_errors_fatal 34006 1726882663.48002: checking for max_fail_percentage 34006 1726882663.48003: done checking for max_fail_percentage 34006 1726882663.48004: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.48005: done checking to see if all hosts have failed 34006 1726882663.48005: 
getting the remaining hosts for this loop 34006 1726882663.48007: done getting the remaining hosts for this loop 34006 1726882663.48010: getting the next task for host managed_node3 34006 1726882663.48018: done getting next task for host managed_node3 34006 1726882663.48021: ^ task is: TASK: Include the task 'setup_mock_wifi.yml' 34006 1726882663.48023: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882663.48027: getting variables 34006 1726882663.48029: in VariableManager get_vars() 34006 1726882663.48083: Calling all_inventory to load vars for managed_node3 34006 1726882663.48086: Calling groups_inventory to load vars for managed_node3 34006 1726882663.48089: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.48103: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.48106: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.48109: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.48578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.49006: done with get_vars() 34006 1726882663.49016: done getting variables TASK [Include the task 'setup_mock_wifi.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 Friday 20 September 2024 21:37:43 -0400 (0:00:00.029) 0:00:03.268 ****** 34006 1726882663.49245: entering _queue_task() for managed_node3/include_tasks 34006 1726882663.49411: done sending task result for task 12673a56-9f93-11ce-7734-00000000000b 34006 1726882663.49415: WORKER PROCESS EXITING 34006 1726882663.49839: worker is 1 
(out of 1 available) 34006 1726882663.49850: exiting _queue_task() for managed_node3/include_tasks 34006 1726882663.49861: done queuing things up, now waiting for results queue to drain 34006 1726882663.49863: waiting for pending results... 34006 1726882663.50423: running TaskExecutor() for managed_node3/TASK: Include the task 'setup_mock_wifi.yml' 34006 1726882663.50428: in run() - task 12673a56-9f93-11ce-7734-00000000000c 34006 1726882663.50448: variable 'ansible_search_path' from source: unknown 34006 1726882663.50553: calling self._execute() 34006 1726882663.50663: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.50749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.50764: variable 'omit' from source: magic vars 34006 1726882663.51470: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.51519: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.51942: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.51946: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.51949: when evaluation is False, skipping this task 34006 1726882663.51951: _execute() done 34006 1726882663.51954: dumping result to json 34006 1726882663.51957: done dumping result, returning 34006 1726882663.51959: done running TaskExecutor() for managed_node3/TASK: Include the task 'setup_mock_wifi.yml' [12673a56-9f93-11ce-7734-00000000000c] 34006 1726882663.51962: sending task result for task 12673a56-9f93-11ce-7734-00000000000c 34006 1726882663.52033: done sending task result for task 12673a56-9f93-11ce-7734-00000000000c 34006 1726882663.52036: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.52089: no more pending results, returning 
what we have 34006 1726882663.52095: results queue empty 34006 1726882663.52096: checking for any_errors_fatal 34006 1726882663.52103: done checking for any_errors_fatal 34006 1726882663.52103: checking for max_fail_percentage 34006 1726882663.52105: done checking for max_fail_percentage 34006 1726882663.52106: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.52107: done checking to see if all hosts have failed 34006 1726882663.52108: getting the remaining hosts for this loop 34006 1726882663.52109: done getting the remaining hosts for this loop 34006 1726882663.52113: getting the next task for host managed_node3 34006 1726882663.52119: done getting next task for host managed_node3 34006 1726882663.52122: ^ task is: TASK: Copy client certs 34006 1726882663.52124: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.52127: getting variables 34006 1726882663.52129: in VariableManager get_vars() 34006 1726882663.52296: Calling all_inventory to load vars for managed_node3 34006 1726882663.52299: Calling groups_inventory to load vars for managed_node3 34006 1726882663.52302: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.52316: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.52319: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.52322: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.52886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.53276: done with get_vars() 34006 1726882663.53287: done getting variables 34006 1726882663.53342: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Copy client certs] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 Friday 20 September 2024 21:37:43 -0400 (0:00:00.041) 0:00:03.310 ****** 34006 1726882663.53373: entering _queue_task() for managed_node3/copy 34006 1726882663.53702: worker is 1 (out of 1 available) 34006 1726882663.53711: exiting _queue_task() for managed_node3/copy 34006 1726882663.53720: done queuing things up, now waiting for results queue to drain 34006 1726882663.53722: waiting for pending results... 
34006 1726882663.53971: running TaskExecutor() for managed_node3/TASK: Copy client certs 34006 1726882663.54335: in run() - task 12673a56-9f93-11ce-7734-00000000000d 34006 1726882663.54339: variable 'ansible_search_path' from source: unknown 34006 1726882663.54615: Loaded config def from plugin (lookup/items) 34006 1726882663.54672: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 34006 1726882663.54820: variable 'omit' from source: magic vars 34006 1726882663.55255: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.55272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.55287: variable 'omit' from source: magic vars 34006 1726882663.56315: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.56330: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.56673: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.56858: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.56861: when evaluation is False, skipping this task 34006 1726882663.56863: variable 'item' from source: unknown 34006 1726882663.57025: variable 'item' from source: unknown skipping: [managed_node3] => (item=client.key) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.key", "skip_reason": "Conditional result was False" } 34006 1726882663.57511: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.57515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.57517: variable 'omit' from source: magic vars 34006 1726882663.57942: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.57945: Evaluated conditional (ansible_distribution_major_version != '6'): True 
34006 1726882663.58098: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.58161: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.58175: when evaluation is False, skipping this task 34006 1726882663.58208: variable 'item' from source: unknown 34006 1726882663.58334: variable 'item' from source: unknown skipping: [managed_node3] => (item=client.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "client.pem", "skip_reason": "Conditional result was False" } 34006 1726882663.58898: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.58904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.58907: variable 'omit' from source: magic vars 34006 1726882663.58919: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.58931: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.59159: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.59313: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.59316: when evaluation is False, skipping this task 34006 1726882663.59319: variable 'item' from source: unknown 34006 1726882663.59478: variable 'item' from source: unknown skipping: [managed_node3] => (item=cacert.pem) => { "ansible_loop_var": "item", "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "item": "cacert.pem", "skip_reason": "Conditional result was False" } 34006 1726882663.59538: dumping result to json 34006 1726882663.59541: done dumping result, returning 34006 1726882663.59543: done running TaskExecutor() for managed_node3/TASK: Copy client certs [12673a56-9f93-11ce-7734-00000000000d] 34006 1726882663.59794: sending task result for task 12673a56-9f93-11ce-7734-00000000000d 34006 
1726882663.59836: done sending task result for task 12673a56-9f93-11ce-7734-00000000000d 34006 1726882663.59839: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false } MSG: All items skipped 34006 1726882663.59932: no more pending results, returning what we have 34006 1726882663.59936: results queue empty 34006 1726882663.59937: checking for any_errors_fatal 34006 1726882663.59941: done checking for any_errors_fatal 34006 1726882663.59942: checking for max_fail_percentage 34006 1726882663.59944: done checking for max_fail_percentage 34006 1726882663.59945: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.59945: done checking to see if all hosts have failed 34006 1726882663.59946: getting the remaining hosts for this loop 34006 1726882663.59948: done getting the remaining hosts for this loop 34006 1726882663.59951: getting the next task for host managed_node3 34006 1726882663.59958: done getting next task for host managed_node3 34006 1726882663.59961: ^ task is: TASK: TEST: wireless connection with WPA-PSK 34006 1726882663.59963: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.59966: getting variables 34006 1726882663.59968: in VariableManager get_vars() 34006 1726882663.60021: Calling all_inventory to load vars for managed_node3 34006 1726882663.60024: Calling groups_inventory to load vars for managed_node3 34006 1726882663.60027: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.60039: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.60042: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.60045: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.60630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.61031: done with get_vars() 34006 1726882663.61198: done getting variables 34006 1726882663.61254: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with WPA-PSK] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 Friday 20 September 2024 21:37:43 -0400 (0:00:00.079) 0:00:03.389 ****** 34006 1726882663.61278: entering _queue_task() for managed_node3/debug 34006 1726882663.61661: worker is 1 (out of 1 available) 34006 1726882663.61673: exiting _queue_task() for managed_node3/debug 34006 1726882663.61685: done queuing things up, now waiting for results queue to drain 34006 1726882663.61686: waiting for pending results... 
34006 1726882663.62122: running TaskExecutor() for managed_node3/TASK: TEST: wireless connection with WPA-PSK 34006 1726882663.62399: in run() - task 12673a56-9f93-11ce-7734-00000000000f 34006 1726882663.62403: variable 'ansible_search_path' from source: unknown 34006 1726882663.62406: calling self._execute() 34006 1726882663.62678: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.62681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.62684: variable 'omit' from source: magic vars 34006 1726882663.63196: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.63399: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.63599: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.63603: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.63607: when evaluation is False, skipping this task 34006 1726882663.63610: _execute() done 34006 1726882663.63612: dumping result to json 34006 1726882663.63615: done dumping result, returning 34006 1726882663.63618: done running TaskExecutor() for managed_node3/TASK: TEST: wireless connection with WPA-PSK [12673a56-9f93-11ce-7734-00000000000f] 34006 1726882663.63621: sending task result for task 12673a56-9f93-11ce-7734-00000000000f 34006 1726882663.63684: done sending task result for task 12673a56-9f93-11ce-7734-00000000000f 34006 1726882663.63687: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882663.63731: no more pending results, returning what we have 34006 1726882663.63735: results queue empty 34006 1726882663.63736: checking for any_errors_fatal 34006 1726882663.63745: done checking for any_errors_fatal 34006 1726882663.63745: checking for max_fail_percentage 34006 1726882663.63747: done checking for max_fail_percentage 34006 
1726882663.63747: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.63748: done checking to see if all hosts have failed 34006 1726882663.63749: getting the remaining hosts for this loop 34006 1726882663.63750: done getting the remaining hosts for this loop 34006 1726882663.63754: getting the next task for host managed_node3 34006 1726882663.63761: done getting next task for host managed_node3 34006 1726882663.63767: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34006 1726882663.63770: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.63784: getting variables 34006 1726882663.63786: in VariableManager get_vars() 34006 1726882663.63835: Calling all_inventory to load vars for managed_node3 34006 1726882663.63838: Calling groups_inventory to load vars for managed_node3 34006 1726882663.63840: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.63850: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.63853: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.63855: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.64224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.64636: done with get_vars() 34006 1726882663.64646: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:43 -0400 (0:00:00.036) 0:00:03.425 ****** 34006 1726882663.64939: entering _queue_task() for managed_node3/include_tasks 34006 1726882663.65364: worker is 1 (out of 1 available) 34006 1726882663.65375: exiting _queue_task() for managed_node3/include_tasks 34006 1726882663.65386: done queuing things up, now waiting for results queue to drain 34006 1726882663.65390: waiting for pending results... 
34006 1726882663.65974: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34006 1726882663.65997: in run() - task 12673a56-9f93-11ce-7734-000000000017 34006 1726882663.66182: variable 'ansible_search_path' from source: unknown 34006 1726882663.66185: variable 'ansible_search_path' from source: unknown 34006 1726882663.66188: calling self._execute() 34006 1726882663.66303: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.66316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.66329: variable 'omit' from source: magic vars 34006 1726882663.66961: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.66975: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.67328: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.67331: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.67334: when evaluation is False, skipping this task 34006 1726882663.67336: _execute() done 34006 1726882663.67338: dumping result to json 34006 1726882663.67340: done dumping result, returning 34006 1726882663.67343: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-11ce-7734-000000000017] 34006 1726882663.67345: sending task result for task 12673a56-9f93-11ce-7734-000000000017 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.67461: no more pending results, returning what we have 34006 1726882663.67465: results queue empty 34006 1726882663.67466: checking for any_errors_fatal 34006 1726882663.67471: done checking for any_errors_fatal 34006 1726882663.67471: checking for max_fail_percentage 34006 
1726882663.67473: done checking for max_fail_percentage 34006 1726882663.67474: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.67474: done checking to see if all hosts have failed 34006 1726882663.67475: getting the remaining hosts for this loop 34006 1726882663.67477: done getting the remaining hosts for this loop 34006 1726882663.67481: getting the next task for host managed_node3 34006 1726882663.67490: done getting next task for host managed_node3 34006 1726882663.67495: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34006 1726882663.67498: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.67518: getting variables 34006 1726882663.67520: in VariableManager get_vars() 34006 1726882663.67564: Calling all_inventory to load vars for managed_node3 34006 1726882663.67567: Calling groups_inventory to load vars for managed_node3 34006 1726882663.67569: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.67580: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.67582: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.67585: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.68399: done sending task result for task 12673a56-9f93-11ce-7734-000000000017 34006 1726882663.68403: WORKER PROCESS EXITING 34006 1726882663.68484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.68911: done with get_vars() 34006 1726882663.68920: done getting variables 34006 1726882663.69141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:43 -0400 (0:00:00.042) 0:00:03.467 ****** 34006 1726882663.69170: entering _queue_task() for managed_node3/debug 34006 1726882663.69721: worker is 1 (out of 1 available) 34006 1726882663.69735: exiting _queue_task() for managed_node3/debug 34006 1726882663.69746: done queuing things up, now waiting for results queue to drain 34006 1726882663.69747: waiting for pending results... 
34006 1726882663.70101: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 34006 1726882663.70299: in run() - task 12673a56-9f93-11ce-7734-000000000018 34006 1726882663.70320: variable 'ansible_search_path' from source: unknown 34006 1726882663.70328: variable 'ansible_search_path' from source: unknown 34006 1726882663.70377: calling self._execute() 34006 1726882663.70478: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.70492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.70511: variable 'omit' from source: magic vars 34006 1726882663.70999: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.71018: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.71134: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.71143: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.71149: when evaluation is False, skipping this task 34006 1726882663.71154: _execute() done 34006 1726882663.71159: dumping result to json 34006 1726882663.71165: done dumping result, returning 34006 1726882663.71172: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-11ce-7734-000000000018] 34006 1726882663.71179: sending task result for task 12673a56-9f93-11ce-7734-000000000018 skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882663.71434: no more pending results, returning what we have 34006 1726882663.71437: results queue empty 34006 1726882663.71438: checking for any_errors_fatal 34006 1726882663.71443: done checking for any_errors_fatal 34006 1726882663.71444: checking for max_fail_percentage 34006 1726882663.71445: done checking for max_fail_percentage 34006 1726882663.71446: checking to see if all 
hosts have failed and the running result is not ok 34006 1726882663.71447: done checking to see if all hosts have failed 34006 1726882663.71447: getting the remaining hosts for this loop 34006 1726882663.71449: done getting the remaining hosts for this loop 34006 1726882663.71452: getting the next task for host managed_node3 34006 1726882663.71457: done getting next task for host managed_node3 34006 1726882663.71460: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34006 1726882663.71463: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.71480: getting variables 34006 1726882663.71482: in VariableManager get_vars() 34006 1726882663.71521: Calling all_inventory to load vars for managed_node3 34006 1726882663.71524: Calling groups_inventory to load vars for managed_node3 34006 1726882663.71526: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.71535: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.71537: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.71540: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.71720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.72171: done with get_vars() 34006 1726882663.72180: done getting variables 34006 1726882663.72224: done sending task result for task 12673a56-9f93-11ce-7734-000000000018 34006 1726882663.72228: WORKER PROCESS EXITING 34006 1726882663.72319: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:43 -0400 (0:00:00.031) 0:00:03.499 ****** 34006 1726882663.72348: entering _queue_task() for managed_node3/fail 34006 1726882663.72350: Creating lock for fail 34006 1726882663.72578: worker is 1 (out of 1 available) 34006 1726882663.72797: exiting _queue_task() for managed_node3/fail 34006 1726882663.72807: done queuing things up, now waiting for results queue to drain 34006 1726882663.72808: waiting for 
pending results... 34006 1726882663.72845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34006 1726882663.72967: in run() - task 12673a56-9f93-11ce-7734-000000000019 34006 1726882663.72986: variable 'ansible_search_path' from source: unknown 34006 1726882663.73001: variable 'ansible_search_path' from source: unknown 34006 1726882663.73042: calling self._execute() 34006 1726882663.73121: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.73132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.73148: variable 'omit' from source: magic vars 34006 1726882663.73497: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.73513: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.73633: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.73643: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.73649: when evaluation is False, skipping this task 34006 1726882663.73683: _execute() done 34006 1726882663.73686: dumping result to json 34006 1726882663.73691: done dumping result, returning 34006 1726882663.73695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-11ce-7734-000000000019] 34006 1726882663.73698: sending task result for task 12673a56-9f93-11ce-7734-000000000019 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.73834: no more pending results, returning what we have 34006 1726882663.73838: results queue empty 34006 
1726882663.73839: checking for any_errors_fatal 34006 1726882663.73844: done checking for any_errors_fatal 34006 1726882663.73845: checking for max_fail_percentage 34006 1726882663.73847: done checking for max_fail_percentage 34006 1726882663.73847: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.73848: done checking to see if all hosts have failed 34006 1726882663.73849: getting the remaining hosts for this loop 34006 1726882663.73850: done getting the remaining hosts for this loop 34006 1726882663.73854: getting the next task for host managed_node3 34006 1726882663.73861: done getting next task for host managed_node3 34006 1726882663.73864: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34006 1726882663.73867: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.73881: getting variables 34006 1726882663.73882: in VariableManager get_vars() 34006 1726882663.74145: Calling all_inventory to load vars for managed_node3 34006 1726882663.74148: Calling groups_inventory to load vars for managed_node3 34006 1726882663.74157: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.74163: done sending task result for task 12673a56-9f93-11ce-7734-000000000019 34006 1726882663.74165: WORKER PROCESS EXITING 34006 1726882663.74173: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.74176: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.74179: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.74432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.74839: done with get_vars() 34006 1726882663.74848: done getting variables 34006 1726882663.74909: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:43 -0400 (0:00:00.025) 0:00:03.525 ****** 34006 1726882663.74937: entering _queue_task() for managed_node3/fail 34006 1726882663.75355: worker is 1 (out of 1 available) 34006 1726882663.75368: exiting _queue_task() for managed_node3/fail 34006 1726882663.75379: done queuing things up, now waiting for results queue to drain 34006 1726882663.75381: waiting for pending results... 
34006 1726882663.75870: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34006 1726882663.76010: in run() - task 12673a56-9f93-11ce-7734-00000000001a 34006 1726882663.76035: variable 'ansible_search_path' from source: unknown 34006 1726882663.76043: variable 'ansible_search_path' from source: unknown 34006 1726882663.76082: calling self._execute() 34006 1726882663.76167: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.76178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.76198: variable 'omit' from source: magic vars 34006 1726882663.76562: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.76701: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.76705: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.76714: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.76721: when evaluation is False, skipping this task 34006 1726882663.76729: _execute() done 34006 1726882663.76736: dumping result to json 34006 1726882663.76743: done dumping result, returning 34006 1726882663.76753: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-11ce-7734-00000000001a] 34006 1726882663.76764: sending task result for task 12673a56-9f93-11ce-7734-00000000001a 34006 1726882663.77001: done sending task result for task 12673a56-9f93-11ce-7734-00000000001a 34006 1726882663.77004: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.77038: no more 
pending results, returning what we have 34006 1726882663.77041: results queue empty 34006 1726882663.77042: checking for any_errors_fatal 34006 1726882663.77046: done checking for any_errors_fatal 34006 1726882663.77047: checking for max_fail_percentage 34006 1726882663.77048: done checking for max_fail_percentage 34006 1726882663.77049: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.77050: done checking to see if all hosts have failed 34006 1726882663.77050: getting the remaining hosts for this loop 34006 1726882663.77051: done getting the remaining hosts for this loop 34006 1726882663.77055: getting the next task for host managed_node3 34006 1726882663.77060: done getting next task for host managed_node3 34006 1726882663.77063: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34006 1726882663.77066: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.77079: getting variables 34006 1726882663.77081: in VariableManager get_vars() 34006 1726882663.77124: Calling all_inventory to load vars for managed_node3 34006 1726882663.77127: Calling groups_inventory to load vars for managed_node3 34006 1726882663.77129: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.77137: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.77140: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.77143: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.77358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.77556: done with get_vars() 34006 1726882663.77566: done getting variables 34006 1726882663.77625: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:37:43 -0400 (0:00:00.027) 0:00:03.552 ****** 34006 1726882663.77654: entering _queue_task() for managed_node3/fail 34006 1726882663.77992: worker is 1 (out of 1 available) 34006 1726882663.78411: exiting _queue_task() for managed_node3/fail 34006 1726882663.78421: done queuing things up, now waiting for results queue to drain 34006 1726882663.78422: waiting for pending results... 
34006 1726882663.78439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34006 1726882663.78653: in run() - task 12673a56-9f93-11ce-7734-00000000001b 34006 1726882663.78771: variable 'ansible_search_path' from source: unknown 34006 1726882663.78899: variable 'ansible_search_path' from source: unknown 34006 1726882663.78912: calling self._execute() 34006 1726882663.79001: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.79013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.79026: variable 'omit' from source: magic vars 34006 1726882663.79390: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.79528: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.79753: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.79813: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.79822: when evaluation is False, skipping this task 34006 1726882663.79829: _execute() done 34006 1726882663.79839: dumping result to json 34006 1726882663.79847: done dumping result, returning 34006 1726882663.79858: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-11ce-7734-00000000001b] 34006 1726882663.79866: sending task result for task 12673a56-9f93-11ce-7734-00000000001b 34006 1726882663.80122: done sending task result for task 12673a56-9f93-11ce-7734-00000000001b 34006 1726882663.80125: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.80207: no more pending 
results, returning what we have 34006 1726882663.80211: results queue empty 34006 1726882663.80211: checking for any_errors_fatal 34006 1726882663.80215: done checking for any_errors_fatal 34006 1726882663.80216: checking for max_fail_percentage 34006 1726882663.80217: done checking for max_fail_percentage 34006 1726882663.80218: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.80219: done checking to see if all hosts have failed 34006 1726882663.80220: getting the remaining hosts for this loop 34006 1726882663.80221: done getting the remaining hosts for this loop 34006 1726882663.80224: getting the next task for host managed_node3 34006 1726882663.80231: done getting next task for host managed_node3 34006 1726882663.80234: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34006 1726882663.80237: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.80251: getting variables 34006 1726882663.80253: in VariableManager get_vars() 34006 1726882663.80305: Calling all_inventory to load vars for managed_node3 34006 1726882663.80308: Calling groups_inventory to load vars for managed_node3 34006 1726882663.80310: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.80321: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.80324: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.80327: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.80844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.81263: done with get_vars() 34006 1726882663.81273: done getting variables 34006 1726882663.81367: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:43 -0400 (0:00:00.037) 0:00:03.590 ****** 34006 1726882663.81400: entering _queue_task() for managed_node3/dnf 34006 1726882663.81627: worker is 1 (out of 1 available) 34006 1726882663.81640: exiting _queue_task() for managed_node3/dnf 34006 1726882663.81651: done queuing things up, now waiting for results queue to drain 34006 1726882663.81652: waiting for pending results... 
34006 1726882663.81907: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34006 1726882663.82037: in run() - task 12673a56-9f93-11ce-7734-00000000001c 34006 1726882663.82057: variable 'ansible_search_path' from source: unknown 34006 1726882663.82065: variable 'ansible_search_path' from source: unknown 34006 1726882663.82108: calling self._execute() 34006 1726882663.82240: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.82244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.82246: variable 'omit' from source: magic vars 34006 1726882663.82537: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.82558: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.82749: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.82770: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.82818: when evaluation is False, skipping this task 34006 1726882663.82822: _execute() done 34006 1726882663.82897: dumping result to json 34006 1726882663.82900: done dumping result, returning 34006 1726882663.82904: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-00000000001c] 34006 1726882663.82906: sending task result for task 12673a56-9f93-11ce-7734-00000000001c 34006 1726882663.82973: done sending task result for task 12673a56-9f93-11ce-7734-00000000001c 34006 1726882663.82976: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34006 1726882663.83336: no more pending results, returning what we have 34006 1726882663.83339: results queue empty 34006 1726882663.83340: checking for any_errors_fatal 34006 1726882663.83344: done checking for any_errors_fatal 34006 1726882663.83345: checking for max_fail_percentage 34006 1726882663.83346: done checking for max_fail_percentage 34006 1726882663.83346: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.83347: done checking to see if all hosts have failed 34006 1726882663.83348: getting the remaining hosts for this loop 34006 1726882663.83349: done getting the remaining hosts for this loop 34006 1726882663.83352: getting the next task for host managed_node3 34006 1726882663.83357: done getting next task for host managed_node3 34006 1726882663.83359: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34006 1726882663.83362: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.83374: getting variables 34006 1726882663.83376: in VariableManager get_vars() 34006 1726882663.83420: Calling all_inventory to load vars for managed_node3 34006 1726882663.83422: Calling groups_inventory to load vars for managed_node3 34006 1726882663.83425: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.83434: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.83437: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.83440: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.83655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.84249: done with get_vars() 34006 1726882663.84258: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34006 1726882663.84330: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:43 -0400 (0:00:00.029) 0:00:03.619 ****** 34006 1726882663.84359: entering _queue_task() for managed_node3/yum 34006 1726882663.84360: Creating lock for yum 34006 1726882663.84819: worker is 1 (out of 1 available) 34006 1726882663.84833: exiting _queue_task() for managed_node3/yum 34006 1726882663.84846: done queuing things up, now waiting for results queue to drain 34006 1726882663.84847: waiting for pending results... 
34006 1726882663.85342: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34006 1726882663.85378: in run() - task 12673a56-9f93-11ce-7734-00000000001d 34006 1726882663.85475: variable 'ansible_search_path' from source: unknown 34006 1726882663.85478: variable 'ansible_search_path' from source: unknown 34006 1726882663.85481: calling self._execute() 34006 1726882663.85548: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882663.85605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882663.85609: variable 'omit' from source: magic vars 34006 1726882663.86343: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.86347: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882663.86408: variable 'ansible_distribution_major_version' from source: facts 34006 1726882663.86419: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882663.86426: when evaluation is False, skipping this task 34006 1726882663.86433: _execute() done 34006 1726882663.86439: dumping result to json 34006 1726882663.86454: done dumping result, returning 34006 1726882663.86464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-00000000001d] 34006 1726882663.86474: sending task result for task 12673a56-9f93-11ce-7734-00000000001d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882663.86664: no more pending results, returning what we have 34006 1726882663.86668: results queue empty 34006 
1726882663.86669: checking for any_errors_fatal 34006 1726882663.86673: done checking for any_errors_fatal 34006 1726882663.86674: checking for max_fail_percentage 34006 1726882663.86676: done checking for max_fail_percentage 34006 1726882663.86677: checking to see if all hosts have failed and the running result is not ok 34006 1726882663.86677: done checking to see if all hosts have failed 34006 1726882663.86678: getting the remaining hosts for this loop 34006 1726882663.86679: done getting the remaining hosts for this loop 34006 1726882663.86683: getting the next task for host managed_node3 34006 1726882663.86691: done getting next task for host managed_node3 34006 1726882663.86696: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34006 1726882663.86699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882663.86717: getting variables 34006 1726882663.86719: in VariableManager get_vars() 34006 1726882663.86763: Calling all_inventory to load vars for managed_node3 34006 1726882663.86766: Calling groups_inventory to load vars for managed_node3 34006 1726882663.86768: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882663.86779: Calling all_plugins_play to load vars for managed_node3 34006 1726882663.86782: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882663.86785: Calling groups_plugins_play to load vars for managed_node3 34006 1726882663.86859: done sending task result for task 12673a56-9f93-11ce-7734-00000000001d 34006 1726882663.86863: WORKER PROCESS EXITING 34006 1726882663.87068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882663.87471: done with get_vars() 34006 1726882663.87480: done getting variables 34006 1726882663.87571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:43 -0400 (0:00:00.033) 0:00:03.653 ****** 34006 1726882663.87682: entering _queue_task() for managed_node3/fail 34006 1726882663.87944: worker is 1 (out of 1 available) 34006 1726882663.87958: exiting _queue_task() for managed_node3/fail 34006 1726882663.87971: done queuing things up, now waiting for results queue to drain 34006 1726882663.87972: waiting for pending results... 
34006 1726882663.88244: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34006 1726882663.88379: in run() - task 12673a56-9f93-11ce-7734-00000000001e
34006 1726882663.88405: variable 'ansible_search_path' from source: unknown
34006 1726882663.88420: variable 'ansible_search_path' from source: unknown
34006 1726882663.88472: calling self._execute()
34006 1726882663.88559: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.88571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.88586: variable 'omit' from source: magic vars
34006 1726882663.88956: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.89198: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.89202: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.89204: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.89207: when evaluation is False, skipping this task
34006 1726882663.89209: _execute() done
34006 1726882663.89211: dumping result to json
34006 1726882663.89214: done dumping result, returning
34006 1726882663.89217: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-00000000001e]
34006 1726882663.89220: sending task result for task 12673a56-9f93-11ce-7734-00000000001e
34006 1726882663.89292: done sending task result for task 12673a56-9f93-11ce-7734-00000000001e
34006 1726882663.89297: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882663.89345: no more pending results, returning what we have
34006 1726882663.89349: results queue empty
34006 1726882663.89350: checking for any_errors_fatal
34006 1726882663.89355: done checking for any_errors_fatal
34006 1726882663.89356: checking for max_fail_percentage
34006 1726882663.89357: done checking for max_fail_percentage
34006 1726882663.89358: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.89359: done checking to see if all hosts have failed
34006 1726882663.89359: getting the remaining hosts for this loop
34006 1726882663.89361: done getting the remaining hosts for this loop
34006 1726882663.89368: getting the next task for host managed_node3
34006 1726882663.89381: done getting next task for host managed_node3
34006 1726882663.89386: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
34006 1726882663.89470: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.89485: getting variables
34006 1726882663.89487: in VariableManager get_vars()
34006 1726882663.89541: Calling all_inventory to load vars for managed_node3
34006 1726882663.89544: Calling groups_inventory to load vars for managed_node3
34006 1726882663.89546: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.89558: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.89561: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.89564: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.90016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.90536: done with get_vars()
34006 1726882663.90545: done getting variables
34006 1726882663.90666: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:37:43 -0400 (0:00:00.030)       0:00:03.683 ******
34006 1726882663.90739: entering _queue_task() for managed_node3/package
34006 1726882663.91126: worker is 1 (out of 1 available)
34006 1726882663.91138: exiting _queue_task() for managed_node3/package
34006 1726882663.91149: done queuing things up, now waiting for results queue to drain
34006 1726882663.91151: waiting for pending results...
34006 1726882663.91336: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
34006 1726882663.91413: in run() - task 12673a56-9f93-11ce-7734-00000000001f
34006 1726882663.91424: variable 'ansible_search_path' from source: unknown
34006 1726882663.91427: variable 'ansible_search_path' from source: unknown
34006 1726882663.91458: calling self._execute()
34006 1726882663.91518: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.91524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.91532: variable 'omit' from source: magic vars
34006 1726882663.91775: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.91783: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.91866: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.91869: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.91872: when evaluation is False, skipping this task
34006 1726882663.91875: _execute() done
34006 1726882663.91878: dumping result to json
34006 1726882663.91880: done dumping result, returning
34006 1726882663.91887: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-11ce-7734-00000000001f]
34006 1726882663.91895: sending task result for task 12673a56-9f93-11ce-7734-00000000001f
34006 1726882663.91978: done sending task result for task 12673a56-9f93-11ce-7734-00000000001f
34006 1726882663.91981: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882663.92027: no more pending results, returning what we have
34006 1726882663.92030: results queue empty
34006 1726882663.92030: checking for any_errors_fatal
34006 1726882663.92035: done checking for any_errors_fatal
34006 1726882663.92036: checking for max_fail_percentage
34006 1726882663.92037: done checking for max_fail_percentage
34006 1726882663.92038: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.92039: done checking to see if all hosts have failed
34006 1726882663.92039: getting the remaining hosts for this loop
34006 1726882663.92040: done getting the remaining hosts for this loop
34006 1726882663.92043: getting the next task for host managed_node3
34006 1726882663.92048: done getting next task for host managed_node3
34006 1726882663.92051: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882663.92054: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.92066: getting variables
34006 1726882663.92067: in VariableManager get_vars()
34006 1726882663.92109: Calling all_inventory to load vars for managed_node3
34006 1726882663.92111: Calling groups_inventory to load vars for managed_node3
34006 1726882663.92112: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.92118: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.92120: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.92121: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.92224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.92345: done with get_vars()
34006 1726882663.92352: done getting variables
34006 1726882663.92387: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:37:43 -0400 (0:00:00.017)       0:00:03.700 ******
34006 1726882663.92410: entering _queue_task() for managed_node3/package
34006 1726882663.92572: worker is 1 (out of 1 available)
34006 1726882663.92585: exiting _queue_task() for managed_node3/package
34006 1726882663.92599: done queuing things up, now waiting for results queue to drain
34006 1726882663.92601: waiting for pending results...
34006 1726882663.92757: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882663.92827: in run() - task 12673a56-9f93-11ce-7734-000000000020
34006 1726882663.92838: variable 'ansible_search_path' from source: unknown
34006 1726882663.92842: variable 'ansible_search_path' from source: unknown
34006 1726882663.92870: calling self._execute()
34006 1726882663.92928: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.92965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.92968: variable 'omit' from source: magic vars
34006 1726882663.93281: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.93285: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.93499: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.93502: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.93504: when evaluation is False, skipping this task
34006 1726882663.93507: _execute() done
34006 1726882663.93509: dumping result to json
34006 1726882663.93511: done dumping result, returning
34006 1726882663.93514: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-11ce-7734-000000000020]
34006 1726882663.93520: sending task result for task 12673a56-9f93-11ce-7734-000000000020
34006 1726882663.93587: done sending task result for task 12673a56-9f93-11ce-7734-000000000020
34006 1726882663.93595: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882663.93667: no more pending results, returning what we have
34006 1726882663.93671: results queue empty
34006 1726882663.93672: checking for any_errors_fatal
34006 1726882663.93676: done checking for any_errors_fatal
34006 1726882663.93677: checking for max_fail_percentage
34006 1726882663.93678: done checking for max_fail_percentage
34006 1726882663.93679: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.93679: done checking to see if all hosts have failed
34006 1726882663.93680: getting the remaining hosts for this loop
34006 1726882663.93682: done getting the remaining hosts for this loop
34006 1726882663.93685: getting the next task for host managed_node3
34006 1726882663.93697: done getting next task for host managed_node3
34006 1726882663.93700: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34006 1726882663.93704: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.93718: getting variables
34006 1726882663.93719: in VariableManager get_vars()
34006 1726882663.93765: Calling all_inventory to load vars for managed_node3
34006 1726882663.93769: Calling groups_inventory to load vars for managed_node3
34006 1726882663.93771: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.93782: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.93785: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.93791: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.94122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.94259: done with get_vars()
34006 1726882663.94266: done getting variables
34006 1726882663.94314: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:37:43 -0400 (0:00:00.019)       0:00:03.719 ******
34006 1726882663.94334: entering _queue_task() for managed_node3/package
34006 1726882663.94504: worker is 1 (out of 1 available)
34006 1726882663.94517: exiting _queue_task() for managed_node3/package
34006 1726882663.94527: done queuing things up, now waiting for results queue to drain
34006 1726882663.94529: waiting for pending results...
34006 1726882663.94687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34006 1726882663.94766: in run() - task 12673a56-9f93-11ce-7734-000000000021
34006 1726882663.94777: variable 'ansible_search_path' from source: unknown
34006 1726882663.94780: variable 'ansible_search_path' from source: unknown
34006 1726882663.94812: calling self._execute()
34006 1726882663.94873: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.94877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.94887: variable 'omit' from source: magic vars
34006 1726882663.95138: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.95155: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.95229: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.95233: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.95236: when evaluation is False, skipping this task
34006 1726882663.95238: _execute() done
34006 1726882663.95241: dumping result to json
34006 1726882663.95244: done dumping result, returning
34006 1726882663.95251: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-11ce-7734-000000000021]
34006 1726882663.95256: sending task result for task 12673a56-9f93-11ce-7734-000000000021
34006 1726882663.95342: done sending task result for task 12673a56-9f93-11ce-7734-000000000021
34006 1726882663.95345: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882663.95388: no more pending results, returning what we have
34006 1726882663.95391: results queue empty
34006 1726882663.95399: checking for any_errors_fatal
34006 1726882663.95399: done checking for any_errors_fatal
34006 1726882663.95399: checking for max_fail_percentage
34006 1726882663.95401: done checking for max_fail_percentage
34006 1726882663.95401: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.95402: done checking to see if all hosts have failed
34006 1726882663.95403: getting the remaining hosts for this loop
34006 1726882663.95404: done getting the remaining hosts for this loop
34006 1726882663.95407: getting the next task for host managed_node3
34006 1726882663.95412: done getting next task for host managed_node3
34006 1726882663.95415: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34006 1726882663.95418: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.95429: getting variables
34006 1726882663.95430: in VariableManager get_vars()
34006 1726882663.95466: Calling all_inventory to load vars for managed_node3
34006 1726882663.95468: Calling groups_inventory to load vars for managed_node3
34006 1726882663.95470: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.95477: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.95479: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.95480: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.95585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.95705: done with get_vars()
34006 1726882663.95712: done getting variables
34006 1726882663.95774: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:37:43 -0400 (0:00:00.014)       0:00:03.734 ******
34006 1726882663.95798: entering _queue_task() for managed_node3/service
34006 1726882663.95799: Creating lock for service
34006 1726882663.95969: worker is 1 (out of 1 available)
34006 1726882663.95982: exiting _queue_task() for managed_node3/service
34006 1726882663.95994: done queuing things up, now waiting for results queue to drain
34006 1726882663.95996: waiting for pending results...
34006 1726882663.96308: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34006 1726882663.96313: in run() - task 12673a56-9f93-11ce-7734-000000000022
34006 1726882663.96316: variable 'ansible_search_path' from source: unknown
34006 1726882663.96318: variable 'ansible_search_path' from source: unknown
34006 1726882663.96336: calling self._execute()
34006 1726882663.96415: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.96428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.96443: variable 'omit' from source: magic vars
34006 1726882663.96790: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.96810: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.96977: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.96981: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.96983: when evaluation is False, skipping this task
34006 1726882663.96985: _execute() done
34006 1726882663.96987: dumping result to json
34006 1726882663.96991: done dumping result, returning
34006 1726882663.96995: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000022]
34006 1726882663.96997: sending task result for task 12673a56-9f93-11ce-7734-000000000022
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882663.97231: no more pending results, returning what we have
34006 1726882663.97234: results queue empty
34006 1726882663.97234: checking for any_errors_fatal
34006 1726882663.97239: done checking for any_errors_fatal
34006 1726882663.97240: checking for max_fail_percentage
34006 1726882663.97241: done checking for max_fail_percentage
34006 1726882663.97242: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.97243: done checking to see if all hosts have failed
34006 1726882663.97243: getting the remaining hosts for this loop
34006 1726882663.97244: done getting the remaining hosts for this loop
34006 1726882663.97247: getting the next task for host managed_node3
34006 1726882663.97252: done getting next task for host managed_node3
34006 1726882663.97255: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34006 1726882663.97258: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.97270: getting variables
34006 1726882663.97271: in VariableManager get_vars()
34006 1726882663.97314: Calling all_inventory to load vars for managed_node3
34006 1726882663.97317: Calling groups_inventory to load vars for managed_node3
34006 1726882663.97319: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.97327: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.97330: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.97333: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.97580: done sending task result for task 12673a56-9f93-11ce-7734-000000000022
34006 1726882663.97583: WORKER PROCESS EXITING
34006 1726882663.97619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.97749: done with get_vars()
34006 1726882663.97756: done getting variables
34006 1726882663.97798: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:37:43 -0400 (0:00:00.020)       0:00:03.754 ******
34006 1726882663.97817: entering _queue_task() for managed_node3/service
34006 1726882663.97979: worker is 1 (out of 1 available)
34006 1726882663.97998: exiting _queue_task() for managed_node3/service
34006 1726882663.98009: done queuing things up, now waiting for results queue to drain
34006 1726882663.98011: waiting for pending results...
34006 1726882663.98163: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34006 1726882663.98240: in run() - task 12673a56-9f93-11ce-7734-000000000023
34006 1726882663.98251: variable 'ansible_search_path' from source: unknown
34006 1726882663.98255: variable 'ansible_search_path' from source: unknown
34006 1726882663.98284: calling self._execute()
34006 1726882663.98347: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882663.98350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882663.98359: variable 'omit' from source: magic vars
34006 1726882663.98605: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.98616: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882663.98686: variable 'ansible_distribution_major_version' from source: facts
34006 1726882663.98695: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882663.98698: when evaluation is False, skipping this task
34006 1726882663.98701: _execute() done
34006 1726882663.98703: dumping result to json
34006 1726882663.98708: done dumping result, returning
34006 1726882663.98716: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-11ce-7734-000000000023]
34006 1726882663.98718: sending task result for task 12673a56-9f93-11ce-7734-000000000023
34006 1726882663.98801: done sending task result for task 12673a56-9f93-11ce-7734-000000000023
34006 1726882663.98804: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34006 1726882663.98844: no more pending results, returning what we have
34006 1726882663.98847: results queue empty
34006 1726882663.98848: checking for any_errors_fatal
34006 1726882663.98852: done checking for any_errors_fatal
34006 1726882663.98853: checking for max_fail_percentage
34006 1726882663.98854: done checking for max_fail_percentage
34006 1726882663.98855: checking to see if all hosts have failed and the running result is not ok
34006 1726882663.98855: done checking to see if all hosts have failed
34006 1726882663.98856: getting the remaining hosts for this loop
34006 1726882663.98857: done getting the remaining hosts for this loop
34006 1726882663.98860: getting the next task for host managed_node3
34006 1726882663.98865: done getting next task for host managed_node3
34006 1726882663.98867: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34006 1726882663.98870: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882663.98882: getting variables
34006 1726882663.98883: in VariableManager get_vars()
34006 1726882663.98926: Calling all_inventory to load vars for managed_node3
34006 1726882663.98927: Calling groups_inventory to load vars for managed_node3
34006 1726882663.98929: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882663.98934: Calling all_plugins_play to load vars for managed_node3
34006 1726882663.98936: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882663.98937: Calling groups_plugins_play to load vars for managed_node3
34006 1726882663.99043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882663.99197: done with get_vars()
34006 1726882663.99206: done getting variables
34006 1726882663.99259: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024  21:37:43 -0400 (0:00:00.014)       0:00:03.769 ******
34006 1726882663.99292: entering _queue_task() for managed_node3/service
34006 1726882663.99506: worker is 1 (out of 1 available)
34006 1726882663.99517: exiting _queue_task() for managed_node3/service
34006 1726882663.99528: done queuing things up, now waiting for results queue to drain
34006 1726882663.99530: waiting for pending results...
34006 1726882663.99811: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34006 1726882663.99911: in run() - task 12673a56-9f93-11ce-7734-000000000024
34006 1726882663.99929: variable 'ansible_search_path' from source: unknown
34006 1726882663.99938: variable 'ansible_search_path' from source: unknown
34006 1726882663.99974: calling self._execute()
34006 1726882664.00048: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.00052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.00061: variable 'omit' from source: magic vars
34006 1726882664.00510: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.00513: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.00625: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.00643: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.00651: when evaluation is False, skipping this task
34006 1726882664.00657: _execute() done
34006 1726882664.00664: dumping result to json
34006 1726882664.00670: done dumping result, returning
34006 1726882664.00727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-11ce-7734-000000000024]
34006 1726882664.00730: sending task result for task 12673a56-9f93-11ce-7734-000000000024
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.00902: no more pending results, returning what we have
34006 1726882664.00905: results queue empty
34006 1726882664.00906: checking for any_errors_fatal
34006 1726882664.00912: done checking for any_errors_fatal
34006 1726882664.00913: checking for max_fail_percentage
34006 1726882664.00915: done checking for max_fail_percentage
34006 1726882664.00915: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.00916: done checking to see if all hosts have failed
34006 1726882664.00917: getting the remaining hosts for this loop
34006 1726882664.00919: done getting the remaining hosts for this loop
34006 1726882664.00923: getting the next task for host managed_node3
34006 1726882664.00931: done getting next task for host managed_node3
34006 1726882664.00934: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34006 1726882664.00943: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.00957: getting variables
34006 1726882664.00959: in VariableManager get_vars()
34006 1726882664.01099: Calling all_inventory to load vars for managed_node3
34006 1726882664.01102: Calling groups_inventory to load vars for managed_node3
34006 1726882664.01105: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.01142: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.01145: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.01148: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.01335: done sending task result for task 12673a56-9f93-11ce-7734-000000000024
34006 1726882664.01339: WORKER PROCESS EXITING
34006 1726882664.01358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.01482: done with get_vars()
34006 1726882664.01492: done getting variables
34006 1726882664.01533: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024  21:37:44 -0400 (0:00:00.022)       0:00:03.791 ******
34006 1726882664.01552: entering _queue_task() for managed_node3/service
34006 1726882664.01730: worker is 1 (out of 1 available)
34006 1726882664.01743: exiting _queue_task() for managed_node3/service
34006 1726882664.01753: done queuing things up, now waiting for results queue to drain
34006 1726882664.01755: waiting for pending results...
34006 1726882664.01903: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882664.01970: in run() - task 12673a56-9f93-11ce-7734-000000000025 34006 1726882664.01987: variable 'ansible_search_path' from source: unknown 34006 1726882664.02000: variable 'ansible_search_path' from source: unknown 34006 1726882664.02016: calling self._execute() 34006 1726882664.02069: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.02074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.02081: variable 'omit' from source: magic vars 34006 1726882664.02328: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.02332: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.02407: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.02412: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.02416: when evaluation is False, skipping this task 34006 1726882664.02419: _execute() done 34006 1726882664.02421: dumping result to json 34006 1726882664.02424: done dumping result, returning 34006 1726882664.02439: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-11ce-7734-000000000025] 34006 1726882664.02442: sending task result for task 12673a56-9f93-11ce-7734-000000000025 34006 1726882664.02516: done sending task result for task 12673a56-9f93-11ce-7734-000000000025 34006 1726882664.02519: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882664.02570: no more pending results, returning what we have 34006 1726882664.02572: results queue empty 34006 1726882664.02573: checking for any_errors_fatal 34006 
1726882664.02577: done checking for any_errors_fatal 34006 1726882664.02578: checking for max_fail_percentage 34006 1726882664.02579: done checking for max_fail_percentage 34006 1726882664.02580: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.02581: done checking to see if all hosts have failed 34006 1726882664.02581: getting the remaining hosts for this loop 34006 1726882664.02582: done getting the remaining hosts for this loop 34006 1726882664.02585: getting the next task for host managed_node3 34006 1726882664.02594: done getting next task for host managed_node3 34006 1726882664.02597: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.02600: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.02612: getting variables 34006 1726882664.02613: in VariableManager get_vars() 34006 1726882664.02644: Calling all_inventory to load vars for managed_node3 34006 1726882664.02646: Calling groups_inventory to load vars for managed_node3 34006 1726882664.02647: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.02652: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.02654: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.02656: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.02758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.02880: done with get_vars() 34006 1726882664.02890: done getting variables 34006 1726882664.02929: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.805 ****** 34006 1726882664.02949: entering _queue_task() for managed_node3/copy 34006 1726882664.03136: worker is 1 (out of 1 available) 34006 1726882664.03149: exiting _queue_task() for managed_node3/copy 34006 1726882664.03161: done queuing things up, now waiting for results queue to drain 34006 1726882664.03162: waiting for pending results... 
34006 1726882664.03611: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.03617: in run() - task 12673a56-9f93-11ce-7734-000000000026 34006 1726882664.03620: variable 'ansible_search_path' from source: unknown 34006 1726882664.03622: variable 'ansible_search_path' from source: unknown 34006 1726882664.03625: calling self._execute() 34006 1726882664.03677: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.03690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.03707: variable 'omit' from source: magic vars 34006 1726882664.04127: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.04147: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.04234: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.04238: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.04241: when evaluation is False, skipping this task 34006 1726882664.04244: _execute() done 34006 1726882664.04246: dumping result to json 34006 1726882664.04251: done dumping result, returning 34006 1726882664.04258: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-11ce-7734-000000000026] 34006 1726882664.04261: sending task result for task 12673a56-9f93-11ce-7734-000000000026 34006 1726882664.04346: done sending task result for task 12673a56-9f93-11ce-7734-000000000026 34006 1726882664.04349: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.04422: no more pending results, returning what we have 34006 1726882664.04425: results queue empty 34006 
1726882664.04425: checking for any_errors_fatal 34006 1726882664.04430: done checking for any_errors_fatal 34006 1726882664.04430: checking for max_fail_percentage 34006 1726882664.04432: done checking for max_fail_percentage 34006 1726882664.04433: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.04433: done checking to see if all hosts have failed 34006 1726882664.04434: getting the remaining hosts for this loop 34006 1726882664.04435: done getting the remaining hosts for this loop 34006 1726882664.04438: getting the next task for host managed_node3 34006 1726882664.04442: done getting next task for host managed_node3 34006 1726882664.04445: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882664.04448: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.04459: getting variables 34006 1726882664.04460: in VariableManager get_vars() 34006 1726882664.04526: Calling all_inventory to load vars for managed_node3 34006 1726882664.04529: Calling groups_inventory to load vars for managed_node3 34006 1726882664.04530: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.04536: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.04537: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.04539: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.04634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.04752: done with get_vars() 34006 1726882664.04758: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:44 -0400 (0:00:00.018) 0:00:03.824 ****** 34006 1726882664.04813: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882664.04815: Creating lock for fedora.linux_system_roles.network_connections 34006 1726882664.04976: worker is 1 (out of 1 available) 34006 1726882664.04986: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882664.05001: done queuing things up, now waiting for results queue to drain 34006 1726882664.05003: waiting for pending results... 
34006 1726882664.05139: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882664.05213: in run() - task 12673a56-9f93-11ce-7734-000000000027 34006 1726882664.05227: variable 'ansible_search_path' from source: unknown 34006 1726882664.05230: variable 'ansible_search_path' from source: unknown 34006 1726882664.05252: calling self._execute() 34006 1726882664.05306: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.05310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.05318: variable 'omit' from source: magic vars 34006 1726882664.05541: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.05551: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.05627: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.05630: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.05633: when evaluation is False, skipping this task 34006 1726882664.05636: _execute() done 34006 1726882664.05638: dumping result to json 34006 1726882664.05642: done dumping result, returning 34006 1726882664.05649: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-11ce-7734-000000000027] 34006 1726882664.05653: sending task result for task 12673a56-9f93-11ce-7734-000000000027 34006 1726882664.05741: done sending task result for task 12673a56-9f93-11ce-7734-000000000027 34006 1726882664.05744: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.05804: no more pending results, returning what we have 34006 1726882664.05806: results queue empty 34006 1726882664.05807: checking 
for any_errors_fatal 34006 1726882664.05811: done checking for any_errors_fatal 34006 1726882664.05812: checking for max_fail_percentage 34006 1726882664.05814: done checking for max_fail_percentage 34006 1726882664.05814: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.05815: done checking to see if all hosts have failed 34006 1726882664.05816: getting the remaining hosts for this loop 34006 1726882664.05817: done getting the remaining hosts for this loop 34006 1726882664.05819: getting the next task for host managed_node3 34006 1726882664.05824: done getting next task for host managed_node3 34006 1726882664.05827: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882664.05829: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.05839: getting variables 34006 1726882664.05840: in VariableManager get_vars() 34006 1726882664.05870: Calling all_inventory to load vars for managed_node3 34006 1726882664.05872: Calling groups_inventory to load vars for managed_node3 34006 1726882664.05873: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.05879: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.05880: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.05882: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.05984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.06124: done with get_vars() 34006 1726882664.06130: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.838 ****** 34006 1726882664.06179: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882664.06180: Creating lock for fedora.linux_system_roles.network_state 34006 1726882664.06348: worker is 1 (out of 1 available) 34006 1726882664.06361: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882664.06371: done queuing things up, now waiting for results queue to drain 34006 1726882664.06373: waiting for pending results... 
34006 1726882664.06506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882664.06571: in run() - task 12673a56-9f93-11ce-7734-000000000028 34006 1726882664.06583: variable 'ansible_search_path' from source: unknown 34006 1726882664.06586: variable 'ansible_search_path' from source: unknown 34006 1726882664.06613: calling self._execute() 34006 1726882664.06665: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.06669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.06678: variable 'omit' from source: magic vars 34006 1726882664.06911: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.06920: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.07036: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.07039: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.07041: when evaluation is False, skipping this task 34006 1726882664.07043: _execute() done 34006 1726882664.07046: dumping result to json 34006 1726882664.07047: done dumping result, returning 34006 1726882664.07049: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-11ce-7734-000000000028] 34006 1726882664.07051: sending task result for task 12673a56-9f93-11ce-7734-000000000028 34006 1726882664.07116: done sending task result for task 12673a56-9f93-11ce-7734-000000000028 34006 1726882664.07119: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.07173: no more pending results, returning what we have 34006 1726882664.07176: results queue empty 34006 1726882664.07176: checking for any_errors_fatal 34006 
1726882664.07183: done checking for any_errors_fatal 34006 1726882664.07183: checking for max_fail_percentage 34006 1726882664.07184: done checking for max_fail_percentage 34006 1726882664.07185: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.07185: done checking to see if all hosts have failed 34006 1726882664.07186: getting the remaining hosts for this loop 34006 1726882664.07186: done getting the remaining hosts for this loop 34006 1726882664.07191: getting the next task for host managed_node3 34006 1726882664.07197: done getting next task for host managed_node3 34006 1726882664.07200: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882664.07202: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.07210: getting variables 34006 1726882664.07211: in VariableManager get_vars() 34006 1726882664.07243: Calling all_inventory to load vars for managed_node3 34006 1726882664.07245: Calling groups_inventory to load vars for managed_node3 34006 1726882664.07246: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.07251: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.07253: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.07255: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.07357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.07477: done with get_vars() 34006 1726882664.07483: done getting variables 34006 1726882664.07524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.851 ****** 34006 1726882664.07544: entering _queue_task() for managed_node3/debug 34006 1726882664.07713: worker is 1 (out of 1 available) 34006 1726882664.07727: exiting _queue_task() for managed_node3/debug 34006 1726882664.07737: done queuing things up, now waiting for results queue to drain 34006 1726882664.07738: waiting for pending results... 
34006 1726882664.07879: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882664.07952: in run() - task 12673a56-9f93-11ce-7734-000000000029 34006 1726882664.07968: variable 'ansible_search_path' from source: unknown 34006 1726882664.07972: variable 'ansible_search_path' from source: unknown 34006 1726882664.07996: calling self._execute() 34006 1726882664.08047: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.08051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.08059: variable 'omit' from source: magic vars 34006 1726882664.08297: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.08302: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.08373: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.08376: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.08379: when evaluation is False, skipping this task 34006 1726882664.08381: _execute() done 34006 1726882664.08384: dumping result to json 34006 1726882664.08391: done dumping result, returning 34006 1726882664.08397: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-11ce-7734-000000000029] 34006 1726882664.08413: sending task result for task 12673a56-9f93-11ce-7734-000000000029 34006 1726882664.08491: done sending task result for task 12673a56-9f93-11ce-7734-000000000029 34006 1726882664.08496: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.08548: no more pending results, returning what we have 34006 1726882664.08550: results queue empty 34006 1726882664.08551: checking for any_errors_fatal 34006 1726882664.08555: done 
checking for any_errors_fatal 34006 1726882664.08555: checking for max_fail_percentage 34006 1726882664.08557: done checking for max_fail_percentage 34006 1726882664.08557: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.08558: done checking to see if all hosts have failed 34006 1726882664.08559: getting the remaining hosts for this loop 34006 1726882664.08560: done getting the remaining hosts for this loop 34006 1726882664.08562: getting the next task for host managed_node3 34006 1726882664.08567: done getting next task for host managed_node3 34006 1726882664.08570: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882664.08573: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.08584: getting variables 34006 1726882664.08586: in VariableManager get_vars() 34006 1726882664.08622: Calling all_inventory to load vars for managed_node3 34006 1726882664.08624: Calling groups_inventory to load vars for managed_node3 34006 1726882664.08626: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.08631: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.08633: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.08634: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.08761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.08877: done with get_vars() 34006 1726882664.08884: done getting variables 34006 1726882664.08923: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.865 ****** 34006 1726882664.08944: entering _queue_task() for managed_node3/debug 34006 1726882664.09100: worker is 1 (out of 1 available) 34006 1726882664.09112: exiting _queue_task() for managed_node3/debug 34006 1726882664.09123: done queuing things up, now waiting for results queue to drain 34006 1726882664.09125: waiting for pending results... 
34006 1726882664.09287: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882664.09365: in run() - task 12673a56-9f93-11ce-7734-00000000002a 34006 1726882664.09375: variable 'ansible_search_path' from source: unknown 34006 1726882664.09379: variable 'ansible_search_path' from source: unknown 34006 1726882664.09408: calling self._execute() 34006 1726882664.09461: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.09467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.09482: variable 'omit' from source: magic vars 34006 1726882664.09727: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.09736: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.09816: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.09820: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.09823: when evaluation is False, skipping this task 34006 1726882664.09826: _execute() done 34006 1726882664.09829: dumping result to json 34006 1726882664.09831: done dumping result, returning 34006 1726882664.09837: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-11ce-7734-00000000002a] 34006 1726882664.09842: sending task result for task 12673a56-9f93-11ce-7734-00000000002a 34006 1726882664.09934: done sending task result for task 12673a56-9f93-11ce-7734-00000000002a 34006 1726882664.09937: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.09977: no more pending results, returning what we have 34006 1726882664.09979: results queue empty 34006 1726882664.09980: checking for any_errors_fatal 34006 1726882664.09983: done 
checking for any_errors_fatal
34006 1726882664.09984: checking for max_fail_percentage
34006 1726882664.09985: done checking for max_fail_percentage
34006 1726882664.09986: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.09987: done checking to see if all hosts have failed
34006 1726882664.09990: getting the remaining hosts for this loop
34006 1726882664.09991: done getting the remaining hosts for this loop
34006 1726882664.09996: getting the next task for host managed_node3
34006 1726882664.10001: done getting next task for host managed_node3
34006 1726882664.10005: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34006 1726882664.10007: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.10020: getting variables
34006 1726882664.10021: in VariableManager get_vars()
34006 1726882664.10060: Calling all_inventory to load vars for managed_node3
34006 1726882664.10062: Calling groups_inventory to load vars for managed_node3
34006 1726882664.10064: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.10069: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.10071: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.10072: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.10176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.10298: done with get_vars()
34006 1726882664.10306: done getting variables
34006 1726882664.10342: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:03.879 ******
34006 1726882664.10361: entering _queue_task() for managed_node3/debug
34006 1726882664.10521: worker is 1 (out of 1 available)
34006 1726882664.10533: exiting _queue_task() for managed_node3/debug
34006 1726882664.10545: done queuing things up, now waiting for results queue to drain
34006 1726882664.10547: waiting for pending results...
34006 1726882664.10720: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34006 1726882664.10795: in run() - task 12673a56-9f93-11ce-7734-00000000002b
34006 1726882664.10805: variable 'ansible_search_path' from source: unknown
34006 1726882664.10809: variable 'ansible_search_path' from source: unknown
34006 1726882664.10837: calling self._execute()
34006 1726882664.10891: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.10896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.10904: variable 'omit' from source: magic vars
34006 1726882664.11136: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.11145: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.11224: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.11227: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.11230: when evaluation is False, skipping this task
34006 1726882664.11234: _execute() done
34006 1726882664.11237: dumping result to json
34006 1726882664.11240: done dumping result, returning
34006 1726882664.11247: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-11ce-7734-00000000002b]
34006 1726882664.11251: sending task result for task 12673a56-9f93-11ce-7734-00000000002b
34006 1726882664.11342: done sending task result for task 12673a56-9f93-11ce-7734-00000000002b
34006 1726882664.11345: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34006 1726882664.11384: no more pending results, returning what we have
34006 1726882664.11387: results queue empty
34006 1726882664.11390: checking for any_errors_fatal
34006 1726882664.11396: done checking for any_errors_fatal
34006 1726882664.11397: checking for max_fail_percentage
34006 1726882664.11399: done checking for max_fail_percentage
34006 1726882664.11399: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.11400: done checking to see if all hosts have failed
34006 1726882664.11401: getting the remaining hosts for this loop
34006 1726882664.11410: done getting the remaining hosts for this loop
34006 1726882664.11413: getting the next task for host managed_node3
34006 1726882664.11418: done getting next task for host managed_node3
34006 1726882664.11421: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
34006 1726882664.11424: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.11435: getting variables
34006 1726882664.11436: in VariableManager get_vars()
34006 1726882664.11464: Calling all_inventory to load vars for managed_node3
34006 1726882664.11466: Calling groups_inventory to load vars for managed_node3
34006 1726882664.11467: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.11472: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.11474: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.11475: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.11610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.11730: done with get_vars()
34006 1726882664.11738: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:03.894 ******
34006 1726882664.11800: entering _queue_task() for managed_node3/ping
34006 1726882664.11801: Creating lock for ping
34006 1726882664.11966: worker is 1 (out of 1 available)
34006 1726882664.11978: exiting _queue_task() for managed_node3/ping
34006 1726882664.11988: done queuing things up, now waiting for results queue to drain
34006 1726882664.11990: waiting for pending results...
34006 1726882664.12152: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity
34006 1726882664.12226: in run() - task 12673a56-9f93-11ce-7734-00000000002c
34006 1726882664.12239: variable 'ansible_search_path' from source: unknown
34006 1726882664.12242: variable 'ansible_search_path' from source: unknown
34006 1726882664.12267: calling self._execute()
34006 1726882664.12327: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.12331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.12342: variable 'omit' from source: magic vars
34006 1726882664.12591: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.12604: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.12679: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.12683: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.12686: when evaluation is False, skipping this task
34006 1726882664.12695: _execute() done
34006 1726882664.12698: dumping result to json
34006 1726882664.12700: done dumping result, returning
34006 1726882664.12707: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-11ce-7734-00000000002c]
34006 1726882664.12712: sending task result for task 12673a56-9f93-11ce-7734-00000000002c
34006 1726882664.12787: done sending task result for task 12673a56-9f93-11ce-7734-00000000002c
34006 1726882664.12790: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.12834: no more pending results, returning what we have
34006 1726882664.12838: results queue empty
34006 1726882664.12839: checking for any_errors_fatal
34006 1726882664.12843: done checking for any_errors_fatal
34006 1726882664.12844: checking for max_fail_percentage
34006 1726882664.12845: done checking for max_fail_percentage
34006 1726882664.12846: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.12847: done checking to see if all hosts have failed
34006 1726882664.12847: getting the remaining hosts for this loop
34006 1726882664.12849: done getting the remaining hosts for this loop
34006 1726882664.12852: getting the next task for host managed_node3
34006 1726882664.12859: done getting next task for host managed_node3
34006 1726882664.12861: ^ task is: TASK: meta (role_complete)
34006 1726882664.12864: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.12876: getting variables
34006 1726882664.12877: in VariableManager get_vars()
34006 1726882664.12916: Calling all_inventory to load vars for managed_node3
34006 1726882664.12918: Calling groups_inventory to load vars for managed_node3
34006 1726882664.12920: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.12927: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.12930: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.12933: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.13032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.13151: done with get_vars()
34006 1726882664.13158: done getting variables
34006 1726882664.13211: done queuing things up, now waiting for results queue to drain
34006 1726882664.13212: results queue empty
34006 1726882664.13212: checking for any_errors_fatal
34006 1726882664.13214: done checking for any_errors_fatal
34006 1726882664.13215: checking for max_fail_percentage
34006 1726882664.13216: done checking for max_fail_percentage
34006 1726882664.13216: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.13216: done checking to see if all hosts have failed
34006 1726882664.13217: getting the remaining hosts for this loop
34006 1726882664.13217: done getting the remaining hosts for this loop
34006 1726882664.13219: getting the next task for host managed_node3
34006 1726882664.13222: done getting next task for host managed_node3
34006 1726882664.13224: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34006 1726882664.13225: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.13230: getting variables
34006 1726882664.13231: in VariableManager get_vars()
34006 1726882664.13243: Calling all_inventory to load vars for managed_node3
34006 1726882664.13244: Calling groups_inventory to load vars for managed_node3
34006 1726882664.13245: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.13248: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.13250: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.13252: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.13335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.13608: done with get_vars()
34006 1726882664.13614: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:37:44 -0400 (0:00:00.018) 0:00:03.912 ******
34006 1726882664.13660: entering _queue_task() for managed_node3/include_tasks
34006 1726882664.13838: worker is 1 (out of 1 available)
34006 1726882664.13849: exiting _queue_task() for managed_node3/include_tasks
34006 1726882664.13859: done queuing things up, now waiting for results queue to drain
34006 1726882664.13861: waiting for pending results...
34006 1726882664.14025: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34006 1726882664.14110: in run() - task 12673a56-9f93-11ce-7734-000000000063
34006 1726882664.14121: variable 'ansible_search_path' from source: unknown
34006 1726882664.14124: variable 'ansible_search_path' from source: unknown
34006 1726882664.14151: calling self._execute()
34006 1726882664.14213: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.14217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.14226: variable 'omit' from source: magic vars
34006 1726882664.14480: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.14489: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.14568: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.14574: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.14577: when evaluation is False, skipping this task
34006 1726882664.14579: _execute() done
34006 1726882664.14582: dumping result to json
34006 1726882664.14584: done dumping result, returning
34006 1726882664.14628: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-11ce-7734-000000000063]
34006 1726882664.14632: sending task result for task 12673a56-9f93-11ce-7734-000000000063
34006 1726882664.14709: done sending task result for task 12673a56-9f93-11ce-7734-000000000063
34006 1726882664.14712: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.14760: no more pending results, returning what we have
34006 1726882664.14762: results queue empty
34006 1726882664.14763: checking for any_errors_fatal
34006 1726882664.14765: done checking for any_errors_fatal
34006 1726882664.14766: checking for max_fail_percentage
34006 1726882664.14767: done checking for max_fail_percentage
34006 1726882664.14767: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.14768: done checking to see if all hosts have failed
34006 1726882664.14769: getting the remaining hosts for this loop
34006 1726882664.14770: done getting the remaining hosts for this loop
34006 1726882664.14773: getting the next task for host managed_node3
34006 1726882664.14778: done getting next task for host managed_node3
34006 1726882664.14781: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34006 1726882664.14784: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.14796: getting variables
34006 1726882664.14797: in VariableManager get_vars()
34006 1726882664.14831: Calling all_inventory to load vars for managed_node3
34006 1726882664.14833: Calling groups_inventory to load vars for managed_node3
34006 1726882664.14834: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.14840: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.14841: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.14843: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.14945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.15069: done with get_vars()
34006 1726882664.15076: done getting variables
34006 1726882664.15117: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:03.927 ******
34006 1726882664.15139: entering _queue_task() for managed_node3/debug
34006 1726882664.15302: worker is 1 (out of 1 available)
34006 1726882664.15313: exiting _queue_task() for managed_node3/debug
34006 1726882664.15323: done queuing things up, now waiting for results queue to drain
34006 1726882664.15325: waiting for pending results...
34006 1726882664.15483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider
34006 1726882664.15553: in run() - task 12673a56-9f93-11ce-7734-000000000064
34006 1726882664.15564: variable 'ansible_search_path' from source: unknown
34006 1726882664.15569: variable 'ansible_search_path' from source: unknown
34006 1726882664.15598: calling self._execute()
34006 1726882664.15661: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.15665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.15675: variable 'omit' from source: magic vars
34006 1726882664.15929: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.15938: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.16021: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.16025: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.16028: when evaluation is False, skipping this task
34006 1726882664.16031: _execute() done
34006 1726882664.16033: dumping result to json
34006 1726882664.16035: done dumping result, returning
34006 1726882664.16038: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-11ce-7734-000000000064]
34006 1726882664.16042: sending task result for task 12673a56-9f93-11ce-7734-000000000064
34006 1726882664.16123: done sending task result for task 12673a56-9f93-11ce-7734-000000000064
34006 1726882664.16126: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34006 1726882664.16165: no more pending results, returning what we have
34006 1726882664.16168: results queue empty
34006 1726882664.16169: checking for any_errors_fatal
34006 1726882664.16173: done checking for any_errors_fatal
34006 1726882664.16173: checking for max_fail_percentage
34006 1726882664.16175: done checking for max_fail_percentage
34006 1726882664.16176: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.16176: done checking to see if all hosts have failed
34006 1726882664.16177: getting the remaining hosts for this loop
34006 1726882664.16178: done getting the remaining hosts for this loop
34006 1726882664.16181: getting the next task for host managed_node3
34006 1726882664.16186: done getting next task for host managed_node3
34006 1726882664.16192: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34006 1726882664.16196: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.16209: getting variables
34006 1726882664.16211: in VariableManager get_vars()
34006 1726882664.16245: Calling all_inventory to load vars for managed_node3
34006 1726882664.16247: Calling groups_inventory to load vars for managed_node3
34006 1726882664.16250: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.16256: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.16258: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.16260: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.16396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.16513: done with get_vars()
34006 1726882664.16519: done getting variables
34006 1726882664.16554: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:03.942 ******
34006 1726882664.16574: entering _queue_task() for managed_node3/fail
34006 1726882664.16735: worker is 1 (out of 1 available)
34006 1726882664.16746: exiting _queue_task() for managed_node3/fail
34006 1726882664.16756: done queuing things up, now waiting for results queue to drain
34006 1726882664.16758: waiting for pending results...
34006 1726882664.16902: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34006 1726882664.16973: in run() - task 12673a56-9f93-11ce-7734-000000000065
34006 1726882664.16984: variable 'ansible_search_path' from source: unknown
34006 1726882664.16990: variable 'ansible_search_path' from source: unknown
34006 1726882664.17016: calling self._execute()
34006 1726882664.17066: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.17070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.17079: variable 'omit' from source: magic vars
34006 1726882664.17317: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.17326: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.17401: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.17405: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.17407: when evaluation is False, skipping this task
34006 1726882664.17411: _execute() done
34006 1726882664.17414: dumping result to json
34006 1726882664.17417: done dumping result, returning
34006 1726882664.17423: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-11ce-7734-000000000065]
34006 1726882664.17425: sending task result for task 12673a56-9f93-11ce-7734-000000000065
34006 1726882664.17507: done sending task result for task 12673a56-9f93-11ce-7734-000000000065
34006 1726882664.17510: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.17562: no more pending results, returning what we have
34006 1726882664.17564: results queue empty
34006 1726882664.17565: checking for any_errors_fatal
34006 1726882664.17569: done checking for any_errors_fatal
34006 1726882664.17569: checking for max_fail_percentage
34006 1726882664.17571: done checking for max_fail_percentage
34006 1726882664.17571: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.17572: done checking to see if all hosts have failed
34006 1726882664.17573: getting the remaining hosts for this loop
34006 1726882664.17574: done getting the remaining hosts for this loop
34006 1726882664.17577: getting the next task for host managed_node3
34006 1726882664.17581: done getting next task for host managed_node3
34006 1726882664.17584: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34006 1726882664.17587: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.17603: getting variables
34006 1726882664.17605: in VariableManager get_vars()
34006 1726882664.17636: Calling all_inventory to load vars for managed_node3
34006 1726882664.17638: Calling groups_inventory to load vars for managed_node3
34006 1726882664.17639: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.17645: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.17646: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.17648: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.17747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.17867: done with get_vars()
34006 1726882664.17874: done getting variables
34006 1726882664.17912: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.955 ******
34006 1726882664.17932: entering _queue_task() for managed_node3/fail
34006 1726882664.18096: worker is 1 (out of 1 available)
34006 1726882664.18108: exiting _queue_task() for managed_node3/fail
34006 1726882664.18117: done queuing things up, now waiting for results queue to drain
34006 1726882664.18119: waiting for pending results...
34006 1726882664.18255: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34006 1726882664.18319: in run() - task 12673a56-9f93-11ce-7734-000000000066
34006 1726882664.18329: variable 'ansible_search_path' from source: unknown
34006 1726882664.18332: variable 'ansible_search_path' from source: unknown
34006 1726882664.18359: calling self._execute()
34006 1726882664.18414: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.18418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.18427: variable 'omit' from source: magic vars
34006 1726882664.18654: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.18663: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.18741: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.18744: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.18747: when evaluation is False, skipping this task
34006 1726882664.18751: _execute() done
34006 1726882664.18753: dumping result to json
34006 1726882664.18757: done dumping result, returning
34006 1726882664.18764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-11ce-7734-000000000066]
34006 1726882664.18767: sending task result for task 12673a56-9f93-11ce-7734-000000000066
34006 1726882664.18854: done sending task result for task 12673a56-9f93-11ce-7734-000000000066
34006 1726882664.18857: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.18918: no more pending results, returning what we have
34006 1726882664.18921: results queue empty
34006 1726882664.18921: checking for any_errors_fatal
34006 1726882664.18926: done checking for any_errors_fatal
34006 1726882664.18926: checking for max_fail_percentage
34006 1726882664.18927: done checking for max_fail_percentage
34006 1726882664.18928: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.18929: done checking to see if all hosts have failed
34006 1726882664.18930: getting the remaining hosts for this loop
34006 1726882664.18931: done getting the remaining hosts for this loop
34006 1726882664.18933: getting the next task for host managed_node3
34006 1726882664.18938: done getting next task for host managed_node3
34006 1726882664.18941: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882664.18944: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34006 1726882664.18955: getting variables
34006 1726882664.18956: in VariableManager get_vars()
34006 1726882664.18982: Calling all_inventory to load vars for managed_node3
34006 1726882664.18983: Calling groups_inventory to load vars for managed_node3
34006 1726882664.18985: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.18992: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.18996: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.18998: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.19125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.19243: done with get_vars()
34006 1726882664.19249: done getting variables
34006 1726882664.19291: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.969 ******
34006 1726882664.19312: entering _queue_task() for managed_node3/fail
34006 1726882664.19467: worker is 1 (out of 1 available)
34006 1726882664.19479: exiting _queue_task() for managed_node3/fail
34006 1726882664.19492: done queuing things up, now waiting for results queue to drain
34006 1726882664.19495: waiting for pending results...
34006 1726882664.19632: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882664.19699: in run() - task 12673a56-9f93-11ce-7734-000000000067
34006 1726882664.19710: variable 'ansible_search_path' from source: unknown
34006 1726882664.19713: variable 'ansible_search_path' from source: unknown
34006 1726882664.19739: calling self._execute()
34006 1726882664.19791: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.19797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.19804: variable 'omit' from source: magic vars
34006 1726882664.20031: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.20039: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.20116: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.20120: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.20122: when evaluation is False, skipping this task
34006 1726882664.20125: _execute() done
34006 1726882664.20127: dumping result to json
34006 1726882664.20132: done dumping result, returning
34006 1726882664.20139: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-11ce-7734-000000000067]
34006 1726882664.20143: sending task result for task 12673a56-9f93-11ce-7734-000000000067
34006 1726882664.20226: done sending task result for task 12673a56-9f93-11ce-7734-000000000067
34006 1726882664.20229: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.20296: no more pending results, returning what we have
34006 1726882664.20299: results queue empty
34006 1726882664.20300: checking for any_errors_fatal
34006 1726882664.20304: done checking for any_errors_fatal
34006 1726882664.20304: checking for max_fail_percentage
34006 1726882664.20306: done checking for max_fail_percentage
34006 1726882664.20306: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.20307: done checking to see if all hosts have failed
34006 1726882664.20308: getting the remaining hosts for this loop
34006 1726882664.20309: done getting the remaining hosts for this loop
34006 1726882664.20312: getting the next task for host managed_node3
34006 1726882664.20316: done getting next task for host managed_node3
34006 1726882664.20319: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34006 1726882664.20321: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 34006 1726882664.20329: getting variables 34006 1726882664.20330: in VariableManager get_vars() 34006 1726882664.20356: Calling all_inventory to load vars for managed_node3 34006 1726882664.20357: Calling groups_inventory to load vars for managed_node3 34006 1726882664.20359: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.20364: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.20365: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.20367: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.20467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.20589: done with get_vars() 34006 1726882664.20597: done getting variables 34006 1726882664.20631: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.982 ****** 34006 1726882664.20652: entering _queue_task() for managed_node3/dnf 34006 1726882664.20812: worker is 1 (out of 1 available) 34006 1726882664.20824: exiting _queue_task() for managed_node3/dnf 34006 1726882664.20834: done queuing things up, now waiting for results queue to drain 34006 1726882664.20836: waiting for pending results... 
34006 1726882664.20967: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34006 1726882664.21039: in run() - task 12673a56-9f93-11ce-7734-000000000068 34006 1726882664.21049: variable 'ansible_search_path' from source: unknown 34006 1726882664.21053: variable 'ansible_search_path' from source: unknown 34006 1726882664.21079: calling self._execute() 34006 1726882664.21132: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.21136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.21144: variable 'omit' from source: magic vars 34006 1726882664.21370: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.21379: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.21456: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.21460: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.21463: when evaluation is False, skipping this task 34006 1726882664.21466: _execute() done 34006 1726882664.21468: dumping result to json 34006 1726882664.21473: done dumping result, returning 34006 1726882664.21479: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000068] 34006 1726882664.21485: sending task result for task 12673a56-9f93-11ce-7734-000000000068 34006 1726882664.21570: done sending task result for task 12673a56-9f93-11ce-7734-000000000068 34006 1726882664.21573: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34006 1726882664.21634: no more pending results, returning what we have 34006 1726882664.21637: results queue empty 34006 1726882664.21638: checking for any_errors_fatal 34006 1726882664.21642: done checking for any_errors_fatal 34006 1726882664.21642: checking for max_fail_percentage 34006 1726882664.21644: done checking for max_fail_percentage 34006 1726882664.21644: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.21645: done checking to see if all hosts have failed 34006 1726882664.21646: getting the remaining hosts for this loop 34006 1726882664.21647: done getting the remaining hosts for this loop 34006 1726882664.21650: getting the next task for host managed_node3 34006 1726882664.21654: done getting next task for host managed_node3 34006 1726882664.21658: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34006 1726882664.21660: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.21672: getting variables 34006 1726882664.21673: in VariableManager get_vars() 34006 1726882664.21708: Calling all_inventory to load vars for managed_node3 34006 1726882664.21710: Calling groups_inventory to load vars for managed_node3 34006 1726882664.21711: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.21717: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.21718: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.21720: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.21850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.21968: done with get_vars() 34006 1726882664.21974: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34006 1726882664.22028: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:03.996 ****** 34006 1726882664.22047: entering _queue_task() for managed_node3/yum 34006 1726882664.22215: worker is 1 (out of 1 available) 34006 1726882664.22229: exiting _queue_task() for managed_node3/yum 34006 1726882664.22240: done queuing things up, now waiting for results queue to drain 34006 1726882664.22242: waiting for pending results... 
34006 1726882664.22381: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34006 1726882664.22459: in run() - task 12673a56-9f93-11ce-7734-000000000069 34006 1726882664.22469: variable 'ansible_search_path' from source: unknown 34006 1726882664.22479: variable 'ansible_search_path' from source: unknown 34006 1726882664.22507: calling self._execute() 34006 1726882664.22561: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.22565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.22574: variable 'omit' from source: magic vars 34006 1726882664.22818: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.22827: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.22905: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.22910: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.22913: when evaluation is False, skipping this task 34006 1726882664.22916: _execute() done 34006 1726882664.22919: dumping result to json 34006 1726882664.22921: done dumping result, returning 34006 1726882664.22924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000069] 34006 1726882664.22927: sending task result for task 12673a56-9f93-11ce-7734-000000000069 34006 1726882664.23013: done sending task result for task 12673a56-9f93-11ce-7734-000000000069 34006 1726882664.23016: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34006 1726882664.23075: no more pending results, returning what we have 34006 1726882664.23078: results queue empty 34006 1726882664.23079: checking for any_errors_fatal 34006 1726882664.23084: done checking for any_errors_fatal 34006 1726882664.23084: checking for max_fail_percentage 34006 1726882664.23086: done checking for max_fail_percentage 34006 1726882664.23086: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.23089: done checking to see if all hosts have failed 34006 1726882664.23090: getting the remaining hosts for this loop 34006 1726882664.23091: done getting the remaining hosts for this loop 34006 1726882664.23099: getting the next task for host managed_node3 34006 1726882664.23105: done getting next task for host managed_node3 34006 1726882664.23108: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34006 1726882664.23110: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.23121: getting variables 34006 1726882664.23122: in VariableManager get_vars() 34006 1726882664.23153: Calling all_inventory to load vars for managed_node3 34006 1726882664.23155: Calling groups_inventory to load vars for managed_node3 34006 1726882664.23156: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.23162: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.23163: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.23165: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.23271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.23394: done with get_vars() 34006 1726882664.23401: done getting variables 34006 1726882664.23438: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.010 ****** 34006 1726882664.23458: entering _queue_task() for managed_node3/fail 34006 1726882664.23629: worker is 1 (out of 1 available) 34006 1726882664.23642: exiting _queue_task() for managed_node3/fail 34006 1726882664.23654: done queuing things up, now waiting for results queue to drain 34006 1726882664.23655: waiting for pending results... 
34006 1726882664.23798: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34006 1726882664.23868: in run() - task 12673a56-9f93-11ce-7734-00000000006a 34006 1726882664.23883: variable 'ansible_search_path' from source: unknown 34006 1726882664.23890: variable 'ansible_search_path' from source: unknown 34006 1726882664.23910: calling self._execute() 34006 1726882664.23961: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.23965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.23974: variable 'omit' from source: magic vars 34006 1726882664.24211: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.24220: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.24295: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.24299: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.24302: when evaluation is False, skipping this task 34006 1726882664.24304: _execute() done 34006 1726882664.24307: dumping result to json 34006 1726882664.24311: done dumping result, returning 34006 1726882664.24316: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-00000000006a] 34006 1726882664.24325: sending task result for task 12673a56-9f93-11ce-7734-00000000006a 34006 1726882664.24408: done sending task result for task 12673a56-9f93-11ce-7734-00000000006a 34006 1726882664.24411: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.24469: no more pending results, returning what we have 
34006 1726882664.24471: results queue empty 34006 1726882664.24472: checking for any_errors_fatal 34006 1726882664.24477: done checking for any_errors_fatal 34006 1726882664.24477: checking for max_fail_percentage 34006 1726882664.24479: done checking for max_fail_percentage 34006 1726882664.24479: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.24480: done checking to see if all hosts have failed 34006 1726882664.24481: getting the remaining hosts for this loop 34006 1726882664.24482: done getting the remaining hosts for this loop 34006 1726882664.24485: getting the next task for host managed_node3 34006 1726882664.24490: done getting next task for host managed_node3 34006 1726882664.24495: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34006 1726882664.24497: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.24511: getting variables 34006 1726882664.24513: in VariableManager get_vars() 34006 1726882664.24549: Calling all_inventory to load vars for managed_node3 34006 1726882664.24551: Calling groups_inventory to load vars for managed_node3 34006 1726882664.24552: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.24557: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.24559: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.24561: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.24691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.24809: done with get_vars() 34006 1726882664.24816: done getting variables 34006 1726882664.24854: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.025 ****** 34006 1726882664.24875: entering _queue_task() for managed_node3/package 34006 1726882664.25040: worker is 1 (out of 1 available) 34006 1726882664.25054: exiting _queue_task() for managed_node3/package 34006 1726882664.25064: done queuing things up, now waiting for results queue to drain 34006 1726882664.25066: waiting for pending results... 
34006 1726882664.25215: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 34006 1726882664.25280: in run() - task 12673a56-9f93-11ce-7734-00000000006b 34006 1726882664.25296: variable 'ansible_search_path' from source: unknown 34006 1726882664.25300: variable 'ansible_search_path' from source: unknown 34006 1726882664.25327: calling self._execute() 34006 1726882664.25381: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.25385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.25402: variable 'omit' from source: magic vars 34006 1726882664.25639: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.25648: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.25728: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.25732: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.25735: when evaluation is False, skipping this task 34006 1726882664.25740: _execute() done 34006 1726882664.25742: dumping result to json 34006 1726882664.25745: done dumping result, returning 34006 1726882664.25748: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-11ce-7734-00000000006b] 34006 1726882664.25750: sending task result for task 12673a56-9f93-11ce-7734-00000000006b 34006 1726882664.25833: done sending task result for task 12673a56-9f93-11ce-7734-00000000006b 34006 1726882664.25836: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.25881: no more pending results, returning what we have 34006 1726882664.25884: results queue empty 34006 1726882664.25885: checking for any_errors_fatal 34006 1726882664.25889: done 
checking for any_errors_fatal 34006 1726882664.25890: checking for max_fail_percentage 34006 1726882664.25891: done checking for max_fail_percentage 34006 1726882664.25892: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.25895: done checking to see if all hosts have failed 34006 1726882664.25896: getting the remaining hosts for this loop 34006 1726882664.25897: done getting the remaining hosts for this loop 34006 1726882664.25900: getting the next task for host managed_node3 34006 1726882664.25905: done getting next task for host managed_node3 34006 1726882664.25908: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34006 1726882664.25910: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.25923: getting variables 34006 1726882664.25924: in VariableManager get_vars() 34006 1726882664.25962: Calling all_inventory to load vars for managed_node3 34006 1726882664.25964: Calling groups_inventory to load vars for managed_node3 34006 1726882664.25966: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.25971: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.25973: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.25975: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.26081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.26205: done with get_vars() 34006 1726882664.26212: done getting variables 34006 1726882664.26248: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:04.038 ****** 34006 1726882664.26267: entering _queue_task() for managed_node3/package 34006 1726882664.26429: worker is 1 (out of 1 available) 34006 1726882664.26442: exiting _queue_task() for managed_node3/package 34006 1726882664.26454: done queuing things up, now waiting for results queue to drain 34006 1726882664.26456: waiting for pending results... 
34006 1726882664.26618: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34006 1726882664.26682: in run() - task 12673a56-9f93-11ce-7734-00000000006c 34006 1726882664.26696: variable 'ansible_search_path' from source: unknown 34006 1726882664.26700: variable 'ansible_search_path' from source: unknown 34006 1726882664.26731: calling self._execute() 34006 1726882664.26780: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.26786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.26799: variable 'omit' from source: magic vars 34006 1726882664.27036: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.27045: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.27122: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.27126: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.27130: when evaluation is False, skipping this task 34006 1726882664.27133: _execute() done 34006 1726882664.27136: dumping result to json 34006 1726882664.27140: done dumping result, returning 34006 1726882664.27147: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-11ce-7734-00000000006c] 34006 1726882664.27151: sending task result for task 12673a56-9f93-11ce-7734-00000000006c 34006 1726882664.27240: done sending task result for task 12673a56-9f93-11ce-7734-00000000006c 34006 1726882664.27243: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.27286: no more pending results, returning what we have 34006 1726882664.27288: 
results queue empty 34006 1726882664.27289: checking for any_errors_fatal 34006 1726882664.27296: done checking for any_errors_fatal 34006 1726882664.27296: checking for max_fail_percentage 34006 1726882664.27298: done checking for max_fail_percentage 34006 1726882664.27298: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.27299: done checking to see if all hosts have failed 34006 1726882664.27300: getting the remaining hosts for this loop 34006 1726882664.27301: done getting the remaining hosts for this loop 34006 1726882664.27304: getting the next task for host managed_node3 34006 1726882664.27309: done getting next task for host managed_node3 34006 1726882664.27312: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34006 1726882664.27315: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.27328: getting variables 34006 1726882664.27329: in VariableManager get_vars() 34006 1726882664.27363: Calling all_inventory to load vars for managed_node3 34006 1726882664.27365: Calling groups_inventory to load vars for managed_node3 34006 1726882664.27366: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.27372: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.27373: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.27375: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.27506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.27622: done with get_vars() 34006 1726882664.27628: done getting variables 34006 1726882664.27663: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.053 ****** 34006 1726882664.27685: entering _queue_task() for managed_node3/package 34006 1726882664.27847: worker is 1 (out of 1 available) 34006 1726882664.27859: exiting _queue_task() for managed_node3/package 34006 1726882664.27870: done queuing things up, now waiting for results queue to drain 34006 1726882664.27871: waiting for pending results... 
34006 1726882664.28021: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34006 1726882664.28089: in run() - task 12673a56-9f93-11ce-7734-00000000006d 34006 1726882664.28103: variable 'ansible_search_path' from source: unknown 34006 1726882664.28106: variable 'ansible_search_path' from source: unknown 34006 1726882664.28130: calling self._execute() 34006 1726882664.28186: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.28194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.28204: variable 'omit' from source: magic vars 34006 1726882664.28440: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.28448: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.28526: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.28531: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.28534: when evaluation is False, skipping this task 34006 1726882664.28537: _execute() done 34006 1726882664.28540: dumping result to json 34006 1726882664.28542: done dumping result, returning 34006 1726882664.28548: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-11ce-7734-00000000006d] 34006 1726882664.28553: sending task result for task 12673a56-9f93-11ce-7734-00000000006d 34006 1726882664.28643: done sending task result for task 12673a56-9f93-11ce-7734-00000000006d 34006 1726882664.28646: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.28691: no more pending results, returning what we have 34006 1726882664.28703: results queue 
empty 34006 1726882664.28704: checking for any_errors_fatal 34006 1726882664.28708: done checking for any_errors_fatal 34006 1726882664.28709: checking for max_fail_percentage 34006 1726882664.28710: done checking for max_fail_percentage 34006 1726882664.28711: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.28712: done checking to see if all hosts have failed 34006 1726882664.28712: getting the remaining hosts for this loop 34006 1726882664.28714: done getting the remaining hosts for this loop 34006 1726882664.28716: getting the next task for host managed_node3 34006 1726882664.28721: done getting next task for host managed_node3 34006 1726882664.28724: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34006 1726882664.28727: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.28739: getting variables 34006 1726882664.28740: in VariableManager get_vars() 34006 1726882664.28767: Calling all_inventory to load vars for managed_node3 34006 1726882664.28769: Calling groups_inventory to load vars for managed_node3 34006 1726882664.28770: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.28775: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.28777: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.28778: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.28884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.29008: done with get_vars() 34006 1726882664.29015: done getting variables 34006 1726882664.29052: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:04.066 ****** 34006 1726882664.29072: entering _queue_task() for managed_node3/service 34006 1726882664.29237: worker is 1 (out of 1 available) 34006 1726882664.29248: exiting _queue_task() for managed_node3/service 34006 1726882664.29260: done queuing things up, now waiting for results queue to drain 34006 1726882664.29262: waiting for pending results... 
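The skip pattern that repeats throughout this section can be reproduced in miniature: every task in the role carries a shared minimum-version guard (`ansible_distribution_major_version != '6'`) plus a task-specific condition, the `when` list is evaluated in order, and the first condition that comes out false is reported back as `false_condition`. A minimal sketch of that behavior (illustrative only — ansible-core evaluates these as Jinja2 templates, not with `eval()`, and the fact value `"10"` is a stand-in, since the log only tells us the node is neither 6 nor 7):

```python
def evaluate_when(conditions, facts):
    """Mimic Ansible's when-list handling: every condition must be true;
    the first false one is reported back as `false_condition`."""
    for cond in conditions:
        # Stand-in for Jinja2 evaluation of the conditional string.
        if not eval(cond, {}, facts):  # illustration only, not ansible-core's code
            return {"changed": False,
                    "false_condition": cond,
                    "skip_reason": "Conditional result was False"}
    return None  # no condition failed: the task would run


# Hypothetical fact value; the log only shows it is neither '6' nor '7'.
facts = {"ansible_distribution_major_version": "10"}
result = evaluate_when(
    ["ansible_distribution_major_version != '6'",
     "ansible_distribution_major_version == '7'"],
    facts,
)
```

With these inputs the first guard passes and the second fails, which matches the `skipping: [managed_node3]` payloads printed above.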
34006 1726882664.29421: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34006 1726882664.29503: in run() - task 12673a56-9f93-11ce-7734-00000000006e 34006 1726882664.29508: variable 'ansible_search_path' from source: unknown 34006 1726882664.29511: variable 'ansible_search_path' from source: unknown 34006 1726882664.29534: calling self._execute() 34006 1726882664.29598: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.29604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.29607: variable 'omit' from source: magic vars 34006 1726882664.29847: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.29856: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.29935: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.29938: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.29941: when evaluation is False, skipping this task 34006 1726882664.29944: _execute() done 34006 1726882664.29947: dumping result to json 34006 1726882664.29950: done dumping result, returning 34006 1726882664.29961: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-00000000006e] 34006 1726882664.29963: sending task result for task 12673a56-9f93-11ce-7734-00000000006e 34006 1726882664.30045: done sending task result for task 12673a56-9f93-11ce-7734-00000000006e 34006 1726882664.30047: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.30106: no more pending results, returning what we have 34006 1726882664.30109: results queue empty 
34006 1726882664.30110: checking for any_errors_fatal 34006 1726882664.30114: done checking for any_errors_fatal 34006 1726882664.30115: checking for max_fail_percentage 34006 1726882664.30116: done checking for max_fail_percentage 34006 1726882664.30117: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.30118: done checking to see if all hosts have failed 34006 1726882664.30118: getting the remaining hosts for this loop 34006 1726882664.30119: done getting the remaining hosts for this loop 34006 1726882664.30122: getting the next task for host managed_node3 34006 1726882664.30127: done getting next task for host managed_node3 34006 1726882664.30130: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34006 1726882664.30133: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.30146: getting variables 34006 1726882664.30147: in VariableManager get_vars() 34006 1726882664.30182: Calling all_inventory to load vars for managed_node3 34006 1726882664.30184: Calling groups_inventory to load vars for managed_node3 34006 1726882664.30185: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.30195: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.30196: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.30198: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.30329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.30446: done with get_vars() 34006 1726882664.30453: done getting variables 34006 1726882664.30490: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.081 ****** 34006 1726882664.30512: entering _queue_task() for managed_node3/service 34006 1726882664.30670: worker is 1 (out of 1 available) 34006 1726882664.30682: exiting _queue_task() for managed_node3/service 34006 1726882664.30697: done queuing things up, now waiting for results queue to drain 34006 1726882664.30699: waiting for pending results... 
34006 1726882664.30939: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34006 1726882664.30944: in run() - task 12673a56-9f93-11ce-7734-00000000006f 34006 1726882664.30947: variable 'ansible_search_path' from source: unknown 34006 1726882664.30949: variable 'ansible_search_path' from source: unknown 34006 1726882664.30952: calling self._execute() 34006 1726882664.31000: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.31004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.31011: variable 'omit' from source: magic vars 34006 1726882664.31246: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.31255: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.31333: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.31337: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.31339: when evaluation is False, skipping this task 34006 1726882664.31344: _execute() done 34006 1726882664.31346: dumping result to json 34006 1726882664.31350: done dumping result, returning 34006 1726882664.31357: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-11ce-7734-00000000006f] 34006 1726882664.31360: sending task result for task 12673a56-9f93-11ce-7734-00000000006f 34006 1726882664.31444: done sending task result for task 12673a56-9f93-11ce-7734-00000000006f 34006 1726882664.31446: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882664.31505: no more pending results, returning what we have 34006 1726882664.31507: results queue empty 34006 1726882664.31508: checking for any_errors_fatal 
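Note the difference between the two skip payloads above: the "Enable and start NetworkManager" task was run with `no_log: true`, so even its skip result is replaced by a `censored` notice, while ordinary skipped tasks expose `false_condition` and `skip_reason`. A hedged sketch of that censoring step (illustrative, not ansible-core's actual implementation; only the two payload shapes visible in this log are assumed):

```python
def prepare_result_for_display(result, no_log=False):
    """Sketch: with no_log set, everything except 'changed' is hidden
    behind a fixed 'censored' message, mirroring the payloads in this log."""
    if no_log:
        return {
            "censored": ("the output has been hidden due to the fact that "
                         "'no_log: true' was specified for this result"),
            "changed": result.get("changed", False),
        }
    return result


raw = {"changed": False,
       "false_condition": "ansible_distribution_major_version == '7'",
       "skip_reason": "Conditional result was False"}
shown = prepare_result_for_display(raw, no_log=True)
```

This is why `no_log` tasks are harder to debug: the condition that caused the skip is censored along with everything else.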
34006 1726882664.31514: done checking for any_errors_fatal 34006 1726882664.31515: checking for max_fail_percentage 34006 1726882664.31516: done checking for max_fail_percentage 34006 1726882664.31517: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.31518: done checking to see if all hosts have failed 34006 1726882664.31518: getting the remaining hosts for this loop 34006 1726882664.31520: done getting the remaining hosts for this loop 34006 1726882664.31522: getting the next task for host managed_node3 34006 1726882664.31527: done getting next task for host managed_node3 34006 1726882664.31530: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34006 1726882664.31533: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.31544: getting variables 34006 1726882664.31545: in VariableManager get_vars() 34006 1726882664.31575: Calling all_inventory to load vars for managed_node3 34006 1726882664.31577: Calling groups_inventory to load vars for managed_node3 34006 1726882664.31578: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.31583: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.31585: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.31587: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.31695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.31817: done with get_vars() 34006 1726882664.31823: done getting variables 34006 1726882664.31858: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:04.095 ****** 34006 1726882664.31878: entering _queue_task() for managed_node3/service 34006 1726882664.32044: worker is 1 (out of 1 available) 34006 1726882664.32056: exiting _queue_task() for managed_node3/service 34006 1726882664.32068: done queuing things up, now waiting for results queue to drain 34006 1726882664.32070: waiting for pending results... 
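The `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` triplet around each task is the strategy's worker loop: one task per host goes onto a queue, a worker executes it, and the main loop drains the results queue before advancing. A loose analogy in plain Python (not ansible-core's `WorkerProcess` machinery — real workers are forked processes running `TaskExecutor()`):

```python
from queue import Queue

# Minimal analogy of the linear strategy's dispatch loop.
tasks, results = Queue(), Queue()

def worker():
    host, task_name = tasks.get()
    # The real worker runs TaskExecutor(); here we just emit a skip,
    # since every task in this section was skipped by its `when` guard.
    results.put((host, {"changed": False,
                        "skip_reason": "Conditional result was False"}))

tasks.put(("managed_node3", "Enable and start wpa_supplicant"))
worker()
host, result = results.get()
```

With only one worker available, each task is fully dispatched and drained before the next `_queue_task()` call, which is why the log stays strictly sequential.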
34006 1726882664.32214: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34006 1726882664.32279: in run() - task 12673a56-9f93-11ce-7734-000000000070 34006 1726882664.32295: variable 'ansible_search_path' from source: unknown 34006 1726882664.32298: variable 'ansible_search_path' from source: unknown 34006 1726882664.32321: calling self._execute() 34006 1726882664.32375: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.32379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.32387: variable 'omit' from source: magic vars 34006 1726882664.32617: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.32627: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.32703: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.32706: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.32710: when evaluation is False, skipping this task 34006 1726882664.32713: _execute() done 34006 1726882664.32715: dumping result to json 34006 1726882664.32719: done dumping result, returning 34006 1726882664.32726: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-11ce-7734-000000000070] 34006 1726882664.32730: sending task result for task 12673a56-9f93-11ce-7734-000000000070 34006 1726882664.32816: done sending task result for task 12673a56-9f93-11ce-7734-000000000070 34006 1726882664.32818: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.32880: no more pending results, returning what we have 34006 1726882664.32883: results queue empty 34006 1726882664.32884: checking for any_errors_fatal 
34006 1726882664.32891: done checking for any_errors_fatal 34006 1726882664.32891: checking for max_fail_percentage 34006 1726882664.32895: done checking for max_fail_percentage 34006 1726882664.32896: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.32897: done checking to see if all hosts have failed 34006 1726882664.32898: getting the remaining hosts for this loop 34006 1726882664.32899: done getting the remaining hosts for this loop 34006 1726882664.32902: getting the next task for host managed_node3 34006 1726882664.32906: done getting next task for host managed_node3 34006 1726882664.32908: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882664.32910: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.32919: getting variables 34006 1726882664.32920: in VariableManager get_vars() 34006 1726882664.32949: Calling all_inventory to load vars for managed_node3 34006 1726882664.32951: Calling groups_inventory to load vars for managed_node3 34006 1726882664.32952: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.32959: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.32961: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.32963: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.33097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.33216: done with get_vars() 34006 1726882664.33222: done getting variables 34006 1726882664.33258: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:44 -0400 (0:00:00.013) 0:00:04.109 ****** 34006 1726882664.33278: entering _queue_task() for managed_node3/service 34006 1726882664.33441: worker is 1 (out of 1 available) 34006 1726882664.33454: exiting _queue_task() for managed_node3/service 34006 1726882664.33465: done queuing things up, now waiting for results queue to drain 34006 1726882664.33467: waiting for pending results... 
34006 1726882664.33607: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882664.33675: in run() - task 12673a56-9f93-11ce-7734-000000000071 34006 1726882664.33685: variable 'ansible_search_path' from source: unknown 34006 1726882664.33692: variable 'ansible_search_path' from source: unknown 34006 1726882664.33720: calling self._execute() 34006 1726882664.33771: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.33774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.33782: variable 'omit' from source: magic vars 34006 1726882664.34010: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.34022: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.34098: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.34103: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.34106: when evaluation is False, skipping this task 34006 1726882664.34109: _execute() done 34006 1726882664.34112: dumping result to json 34006 1726882664.34115: done dumping result, returning 34006 1726882664.34121: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-11ce-7734-000000000071] 34006 1726882664.34126: sending task result for task 12673a56-9f93-11ce-7734-000000000071 34006 1726882664.34209: done sending task result for task 12673a56-9f93-11ce-7734-000000000071 34006 1726882664.34212: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882664.34338: no more pending results, returning what we have 34006 1726882664.34341: results queue empty 34006 1726882664.34342: checking for any_errors_fatal 34006 
1726882664.34351: done checking for any_errors_fatal 34006 1726882664.34352: checking for max_fail_percentage 34006 1726882664.34354: done checking for max_fail_percentage 34006 1726882664.34354: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.34355: done checking to see if all hosts have failed 34006 1726882664.34358: getting the remaining hosts for this loop 34006 1726882664.34359: done getting the remaining hosts for this loop 34006 1726882664.34364: getting the next task for host managed_node3 34006 1726882664.34387: done getting next task for host managed_node3 34006 1726882664.34397: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.34401: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.34418: getting variables 34006 1726882664.34420: in VariableManager get_vars() 34006 1726882664.34465: Calling all_inventory to load vars for managed_node3 34006 1726882664.34468: Calling groups_inventory to load vars for managed_node3 34006 1726882664.34470: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.34478: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.34481: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.34484: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.34602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.34719: done with get_vars() 34006 1726882664.34725: done getting variables 34006 1726882664.34762: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:44 -0400 (0:00:00.015) 0:00:04.124 ****** 34006 1726882664.34781: entering _queue_task() for managed_node3/copy 34006 1726882664.34940: worker is 1 (out of 1 available) 34006 1726882664.34951: exiting _queue_task() for managed_node3/copy 34006 1726882664.34961: done queuing things up, now waiting for results queue to drain 34006 1726882664.34962: waiting for pending results... 
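Each TASK banner carries two timing fields: the previous task's duration in parentheses and a running total for the play, e.g. `(0:00:00.015) 0:00:04.124`. The totals in this section can be cross-checked by accumulating the printed durations, allowing for the log's millisecond rounding:

```python
# (duration shown in the banner, cumulative play time) pairs as printed above
timings = [
    (0.014, 4.053), (0.013, 4.066), (0.014, 4.081), (0.013, 4.095),
    (0.013, 4.109), (0.015, 4.124), (0.014, 4.138), (0.021, 4.159),
]

# Each banner's duration is the time spent on the task since the previous
# banner, so: previous cumulative + this duration == this cumulative,
# up to 1 ms rounding in the log output.
for (_, prev_total), (dur, total) in zip(timings, timings[1:]):
    assert abs((prev_total + dur) - total) < 0.002
```

The sub-20 ms durations are consistent with every task in this stretch being skipped on the controller without contacting the managed node.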
34006 1726882664.35123: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.35200: in run() - task 12673a56-9f93-11ce-7734-000000000072 34006 1726882664.35206: variable 'ansible_search_path' from source: unknown 34006 1726882664.35209: variable 'ansible_search_path' from source: unknown 34006 1726882664.35238: calling self._execute() 34006 1726882664.35290: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.35304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.35308: variable 'omit' from source: magic vars 34006 1726882664.35549: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.35557: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.35636: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.35639: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.35642: when evaluation is False, skipping this task 34006 1726882664.35645: _execute() done 34006 1726882664.35647: dumping result to json 34006 1726882664.35650: done dumping result, returning 34006 1726882664.35659: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-11ce-7734-000000000072] 34006 1726882664.35662: sending task result for task 12673a56-9f93-11ce-7734-000000000072 34006 1726882664.35742: done sending task result for task 12673a56-9f93-11ce-7734-000000000072 34006 1726882664.35745: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.35788: no more pending results, returning what we have 34006 1726882664.35791: results queue empty 34006 
1726882664.35792: checking for any_errors_fatal 34006 1726882664.35799: done checking for any_errors_fatal 34006 1726882664.35799: checking for max_fail_percentage 34006 1726882664.35801: done checking for max_fail_percentage 34006 1726882664.35801: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.35802: done checking to see if all hosts have failed 34006 1726882664.35803: getting the remaining hosts for this loop 34006 1726882664.35804: done getting the remaining hosts for this loop 34006 1726882664.35807: getting the next task for host managed_node3 34006 1726882664.35812: done getting next task for host managed_node3 34006 1726882664.35815: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882664.35818: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.35831: getting variables 34006 1726882664.35832: in VariableManager get_vars() 34006 1726882664.35872: Calling all_inventory to load vars for managed_node3 34006 1726882664.35874: Calling groups_inventory to load vars for managed_node3 34006 1726882664.35875: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.35881: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.35882: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.35884: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.36022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.36139: done with get_vars() 34006 1726882664.36146: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.138 ****** 34006 1726882664.36209: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882664.36486: worker is 1 (out of 1 available) 34006 1726882664.36500: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882664.36510: done queuing things up, now waiting for results queue to drain 34006 1726882664.36512: waiting for pending results... 
34006 1726882664.36733: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882664.36833: in run() - task 12673a56-9f93-11ce-7734-000000000073 34006 1726882664.36836: variable 'ansible_search_path' from source: unknown 34006 1726882664.36839: variable 'ansible_search_path' from source: unknown 34006 1726882664.36841: calling self._execute() 34006 1726882664.36910: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.36921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.36933: variable 'omit' from source: magic vars 34006 1726882664.37252: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.37269: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.37450: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.37453: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.37456: when evaluation is False, skipping this task 34006 1726882664.37458: _execute() done 34006 1726882664.37460: dumping result to json 34006 1726882664.37462: done dumping result, returning 34006 1726882664.37464: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-11ce-7734-000000000073] 34006 1726882664.37467: sending task result for task 12673a56-9f93-11ce-7734-000000000073 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.37601: no more pending results, returning what we have 34006 1726882664.37605: results queue empty 34006 1726882664.37607: checking for any_errors_fatal 34006 1726882664.37612: done checking for any_errors_fatal 34006 1726882664.37613: checking for max_fail_percentage 34006 
1726882664.37615: done checking for max_fail_percentage 34006 1726882664.37615: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.37616: done checking to see if all hosts have failed 34006 1726882664.37617: getting the remaining hosts for this loop 34006 1726882664.37619: done getting the remaining hosts for this loop 34006 1726882664.37622: getting the next task for host managed_node3 34006 1726882664.37628: done getting next task for host managed_node3 34006 1726882664.37631: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882664.37635: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.37649: getting variables 34006 1726882664.37651: in VariableManager get_vars() 34006 1726882664.37697: Calling all_inventory to load vars for managed_node3 34006 1726882664.37700: Calling groups_inventory to load vars for managed_node3 34006 1726882664.37703: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.37713: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.37716: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.37719: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.38033: done sending task result for task 12673a56-9f93-11ce-7734-000000000073 34006 1726882664.38037: WORKER PROCESS EXITING 34006 1726882664.38057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.38264: done with get_vars() 34006 1726882664.38273: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:44 -0400 (0:00:00.021) 0:00:04.159 ****** 34006 1726882664.38346: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882664.38542: worker is 1 (out of 1 available) 34006 1726882664.38553: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882664.38564: done queuing things up, now waiting for results queue to drain 34006 1726882664.38565: waiting for pending results... 
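Unlike the earlier `package`/`service`/`copy` tasks, the last two are dispatched to the role's own action plugins (`fedora.linux_system_roles.network_connections` and `fedora.linux_system_roles.network_state`), which is why `_queue_task()` names them directly. One small pattern worth noting in the IDs: every task in this block shares the prefix `12673a56-9f93-11ce-7734-` and only the final field changes, incrementing by one in hex (`…6d`, `…6e`, …, `…74`) in step with the advancing `task=` counter in the HOST STATE dumps. A quick check of the suffixes as they appear above:

```python
# Final hex field of each task ID, in the order the tasks were queued.
suffixes = ["6d", "6e", "6f", "70", "71", "72", "73", "74"]
values = [int(s, 16) for s in suffixes]

# The suffixes increase by exactly one per task.
assert all(b - a == 1 for a, b in zip(values, values[1:]))
```

This makes it easy to grep a verbose log for a specific task's lifecycle: every message for one task carries the same full ID.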
34006 1726882664.38792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882664.38907: in run() - task 12673a56-9f93-11ce-7734-000000000074 34006 1726882664.38929: variable 'ansible_search_path' from source: unknown 34006 1726882664.38937: variable 'ansible_search_path' from source: unknown 34006 1726882664.38972: calling self._execute() 34006 1726882664.39054: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.39067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.39083: variable 'omit' from source: magic vars 34006 1726882664.39407: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.39423: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.39540: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.39551: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.39562: when evaluation is False, skipping this task 34006 1726882664.39571: _execute() done 34006 1726882664.39578: dumping result to json 34006 1726882664.39586: done dumping result, returning 34006 1726882664.39604: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-11ce-7734-000000000074] 34006 1726882664.39616: sending task result for task 12673a56-9f93-11ce-7734-000000000074 34006 1726882664.39718: done sending task result for task 12673a56-9f93-11ce-7734-000000000074 34006 1726882664.39721: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.39764: no more pending results, returning what we have 34006 1726882664.39767: results queue empty 34006 1726882664.39768: checking for any_errors_fatal 34006 
1726882664.39774: done checking for any_errors_fatal 34006 1726882664.39775: checking for max_fail_percentage 34006 1726882664.39777: done checking for max_fail_percentage 34006 1726882664.39778: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.39778: done checking to see if all hosts have failed 34006 1726882664.39779: getting the remaining hosts for this loop 34006 1726882664.39781: done getting the remaining hosts for this loop 34006 1726882664.39784: getting the next task for host managed_node3 34006 1726882664.39792: done getting next task for host managed_node3 34006 1726882664.39798: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882664.39801: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.39818: getting variables 34006 1726882664.39820: in VariableManager get_vars() 34006 1726882664.39860: Calling all_inventory to load vars for managed_node3 34006 1726882664.39863: Calling groups_inventory to load vars for managed_node3 34006 1726882664.39865: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.39872: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.39874: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.39877: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.40027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.40144: done with get_vars() 34006 1726882664.40153: done getting variables 34006 1726882664.40191: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:44 -0400 (0:00:00.018) 0:00:04.178 ****** 34006 1726882664.40212: entering _queue_task() for managed_node3/debug 34006 1726882664.40370: worker is 1 (out of 1 available) 34006 1726882664.40384: exiting _queue_task() for managed_node3/debug 34006 1726882664.40399: done queuing things up, now waiting for results queue to drain 34006 1726882664.40401: waiting for pending results... 
34006 1726882664.40542: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882664.40614: in run() - task 12673a56-9f93-11ce-7734-000000000075 34006 1726882664.40627: variable 'ansible_search_path' from source: unknown 34006 1726882664.40630: variable 'ansible_search_path' from source: unknown 34006 1726882664.40654: calling self._execute() 34006 1726882664.40708: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.40712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.40720: variable 'omit' from source: magic vars 34006 1726882664.40950: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.40957: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.41034: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.41038: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.41041: when evaluation is False, skipping this task 34006 1726882664.41043: _execute() done 34006 1726882664.41048: dumping result to json 34006 1726882664.41050: done dumping result, returning 34006 1726882664.41058: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-11ce-7734-000000000075] 34006 1726882664.41063: sending task result for task 12673a56-9f93-11ce-7734-000000000075 34006 1726882664.41139: done sending task result for task 12673a56-9f93-11ce-7734-000000000075 34006 1726882664.41143: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.41201: no more pending results, returning what we have 34006 1726882664.41203: results queue empty 34006 1726882664.41204: checking for any_errors_fatal 34006 1726882664.41208: done 
checking for any_errors_fatal 34006 1726882664.41209: checking for max_fail_percentage 34006 1726882664.41210: done checking for max_fail_percentage 34006 1726882664.41210: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.41211: done checking to see if all hosts have failed 34006 1726882664.41212: getting the remaining hosts for this loop 34006 1726882664.41213: done getting the remaining hosts for this loop 34006 1726882664.41217: getting the next task for host managed_node3 34006 1726882664.41222: done getting next task for host managed_node3 34006 1726882664.41225: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882664.41228: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.41240: getting variables 34006 1726882664.41241: in VariableManager get_vars() 34006 1726882664.41271: Calling all_inventory to load vars for managed_node3 34006 1726882664.41273: Calling groups_inventory to load vars for managed_node3 34006 1726882664.41274: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.41280: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.41281: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.41283: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.41385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.41531: done with get_vars() 34006 1726882664.41540: done getting variables 34006 1726882664.41591: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:44 -0400 (0:00:00.014) 0:00:04.192 ****** 34006 1726882664.41620: entering _queue_task() for managed_node3/debug 34006 1726882664.41825: worker is 1 (out of 1 available) 34006 1726882664.41837: exiting _queue_task() for managed_node3/debug 34006 1726882664.41848: done queuing things up, now waiting for results queue to drain 34006 1726882664.41849: waiting for pending results... 
34006 1726882664.42107: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882664.42299: in run() - task 12673a56-9f93-11ce-7734-000000000076 34006 1726882664.42304: variable 'ansible_search_path' from source: unknown 34006 1726882664.42307: variable 'ansible_search_path' from source: unknown 34006 1726882664.42310: calling self._execute() 34006 1726882664.42368: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.42380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.42404: variable 'omit' from source: magic vars 34006 1726882664.42750: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.42768: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.42968: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.42971: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.42974: when evaluation is False, skipping this task 34006 1726882664.42977: _execute() done 34006 1726882664.42979: dumping result to json 34006 1726882664.42981: done dumping result, returning 34006 1726882664.42984: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-11ce-7734-000000000076] 34006 1726882664.42986: sending task result for task 12673a56-9f93-11ce-7734-000000000076 34006 1726882664.43050: done sending task result for task 12673a56-9f93-11ce-7734-000000000076 34006 1726882664.43052: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.43117: no more pending results, returning what we have 34006 1726882664.43120: results queue empty 34006 1726882664.43122: checking for any_errors_fatal 34006 1726882664.43126: done 
checking for any_errors_fatal 34006 1726882664.43127: checking for max_fail_percentage 34006 1726882664.43129: done checking for max_fail_percentage 34006 1726882664.43130: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.43130: done checking to see if all hosts have failed 34006 1726882664.43131: getting the remaining hosts for this loop 34006 1726882664.43133: done getting the remaining hosts for this loop 34006 1726882664.43136: getting the next task for host managed_node3 34006 1726882664.43142: done getting next task for host managed_node3 34006 1726882664.43146: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882664.43149: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.43165: getting variables 34006 1726882664.43167: in VariableManager get_vars() 34006 1726882664.43216: Calling all_inventory to load vars for managed_node3 34006 1726882664.43219: Calling groups_inventory to load vars for managed_node3 34006 1726882664.43221: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.43232: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.43235: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.43237: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.43596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.43801: done with get_vars() 34006 1726882664.43809: done getting variables 34006 1726882664.43859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:44 -0400 (0:00:00.022) 0:00:04.215 ****** 34006 1726882664.43885: entering _queue_task() for managed_node3/debug 34006 1726882664.44095: worker is 1 (out of 1 available) 34006 1726882664.44106: exiting _queue_task() for managed_node3/debug 34006 1726882664.44116: done queuing things up, now waiting for results queue to drain 34006 1726882664.44118: waiting for pending results... 
34006 1726882664.44511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882664.44515: in run() - task 12673a56-9f93-11ce-7734-000000000077 34006 1726882664.44519: variable 'ansible_search_path' from source: unknown 34006 1726882664.44521: variable 'ansible_search_path' from source: unknown 34006 1726882664.44524: calling self._execute() 34006 1726882664.44762: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.45025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.45029: variable 'omit' from source: magic vars 34006 1726882664.45512: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.45582: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.45744: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.45759: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.45767: when evaluation is False, skipping this task 34006 1726882664.45781: _execute() done 34006 1726882664.45795: dumping result to json 34006 1726882664.45804: done dumping result, returning 34006 1726882664.45815: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-11ce-7734-000000000077] 34006 1726882664.45845: sending task result for task 12673a56-9f93-11ce-7734-000000000077 skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.46068: no more pending results, returning what we have 34006 1726882664.46072: results queue empty 34006 1726882664.46073: checking for any_errors_fatal 34006 1726882664.46080: done checking for any_errors_fatal 34006 1726882664.46081: checking for max_fail_percentage 34006 1726882664.46083: done checking for max_fail_percentage 34006 
1726882664.46083: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.46084: done checking to see if all hosts have failed 34006 1726882664.46085: getting the remaining hosts for this loop 34006 1726882664.46086: done getting the remaining hosts for this loop 34006 1726882664.46092: getting the next task for host managed_node3 34006 1726882664.46100: done getting next task for host managed_node3 34006 1726882664.46104: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882664.46108: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.46123: getting variables 34006 1726882664.46125: in VariableManager get_vars() 34006 1726882664.46169: Calling all_inventory to load vars for managed_node3 34006 1726882664.46172: Calling groups_inventory to load vars for managed_node3 34006 1726882664.46174: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.46185: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.46190: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.46284: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.46299: done sending task result for task 12673a56-9f93-11ce-7734-000000000077 34006 1726882664.46302: WORKER PROCESS EXITING 34006 1726882664.46462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.46668: done with get_vars() 34006 1726882664.46677: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:44 -0400 (0:00:00.028) 0:00:04.243 ****** 34006 1726882664.46764: entering _queue_task() for managed_node3/ping 34006 1726882664.46967: worker is 1 (out of 1 available) 34006 1726882664.46978: exiting _queue_task() for managed_node3/ping 34006 1726882664.46991: done queuing things up, now waiting for results queue to drain 34006 1726882664.46995: waiting for pending results... 
34006 1726882664.47234: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882664.47353: in run() - task 12673a56-9f93-11ce-7734-000000000078 34006 1726882664.47374: variable 'ansible_search_path' from source: unknown 34006 1726882664.47381: variable 'ansible_search_path' from source: unknown 34006 1726882664.47424: calling self._execute() 34006 1726882664.47499: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.47510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.47526: variable 'omit' from source: magic vars 34006 1726882664.47886: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.47997: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.48022: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.48033: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.48040: when evaluation is False, skipping this task 34006 1726882664.48047: _execute() done 34006 1726882664.48053: dumping result to json 34006 1726882664.48061: done dumping result, returning 34006 1726882664.48070: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-11ce-7734-000000000078] 34006 1726882664.48079: sending task result for task 12673a56-9f93-11ce-7734-000000000078 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.48257: no more pending results, returning what we have 34006 1726882664.48260: results queue empty 34006 1726882664.48261: checking for any_errors_fatal 34006 1726882664.48267: done checking for any_errors_fatal 34006 1726882664.48268: checking for max_fail_percentage 34006 1726882664.48270: done checking for 
max_fail_percentage 34006 1726882664.48270: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.48271: done checking to see if all hosts have failed 34006 1726882664.48272: getting the remaining hosts for this loop 34006 1726882664.48273: done getting the remaining hosts for this loop 34006 1726882664.48277: getting the next task for host managed_node3 34006 1726882664.48286: done getting next task for host managed_node3 34006 1726882664.48291: ^ task is: TASK: meta (role_complete) 34006 1726882664.48296: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.48311: getting variables 34006 1726882664.48313: in VariableManager get_vars() 34006 1726882664.48354: Calling all_inventory to load vars for managed_node3 34006 1726882664.48356: Calling groups_inventory to load vars for managed_node3 34006 1726882664.48358: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.48367: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.48369: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.48371: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.48702: done sending task result for task 12673a56-9f93-11ce-7734-000000000078 34006 1726882664.48705: WORKER PROCESS EXITING 34006 1726882664.48726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.48921: done with get_vars() 34006 1726882664.48931: done getting variables 34006 1726882664.49001: done queuing things up, now waiting for results queue to drain 34006 1726882664.49003: results queue empty 34006 1726882664.49004: checking for any_errors_fatal 34006 1726882664.49005: done checking for any_errors_fatal 34006 1726882664.49006: checking for max_fail_percentage 34006 1726882664.49007: done checking for max_fail_percentage 34006 1726882664.49008: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.49008: done checking to see if all hosts have failed 34006 1726882664.49009: getting the remaining hosts for this loop 34006 1726882664.49010: done getting the remaining hosts for this loop 34006 1726882664.49012: getting the next task for host managed_node3 34006 1726882664.49015: done getting next task for host managed_node3 34006 1726882664.49017: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 34006 1726882664.49019: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882664.49020: getting variables 34006 1726882664.49021: in VariableManager get_vars() 34006 1726882664.49037: Calling all_inventory to load vars for managed_node3 34006 1726882664.49040: Calling groups_inventory to load vars for managed_node3 34006 1726882664.49041: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.49046: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.49048: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.49051: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.49175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.49355: done with get_vars() 34006 1726882664.49363: done getting variables 34006 1726882664.49403: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Friday 20 September 2024 21:37:44 -0400 (0:00:00.026) 0:00:04.270 ****** 34006 1726882664.49426: entering _queue_task() for managed_node3/debug 34006 1726882664.49661: worker is 1 (out of 1 available) 34006 1726882664.49674: exiting _queue_task() for managed_node3/debug 34006 1726882664.49685: done queuing things up, now waiting for results queue to drain 34006 1726882664.49686: waiting for pending results... 
34006 1726882664.50144: running TaskExecutor() for managed_node3/TASK: TEST: wireless connection with 802.1x TLS-EAP 34006 1726882664.50283: in run() - task 12673a56-9f93-11ce-7734-0000000000a8 34006 1726882664.50416: variable 'ansible_search_path' from source: unknown 34006 1726882664.50456: calling self._execute() 34006 1726882664.50575: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.50654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.50668: variable 'omit' from source: magic vars 34006 1726882664.51246: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.51266: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.51386: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.51403: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.51412: when evaluation is False, skipping this task 34006 1726882664.51420: _execute() done 34006 1726882664.51427: dumping result to json 34006 1726882664.51436: done dumping result, returning 34006 1726882664.51449: done running TaskExecutor() for managed_node3/TASK: TEST: wireless connection with 802.1x TLS-EAP [12673a56-9f93-11ce-7734-0000000000a8] 34006 1726882664.51459: sending task result for task 12673a56-9f93-11ce-7734-0000000000a8 34006 1726882664.51710: done sending task result for task 12673a56-9f93-11ce-7734-0000000000a8 34006 1726882664.51715: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882664.51751: no more pending results, returning what we have 34006 1726882664.51753: results queue empty 34006 1726882664.51754: checking for any_errors_fatal 34006 1726882664.51756: done checking for any_errors_fatal 34006 1726882664.51756: checking for max_fail_percentage 34006 1726882664.51758: done checking for max_fail_percentage 34006 
1726882664.51759: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.51760: done checking to see if all hosts have failed 34006 1726882664.51760: getting the remaining hosts for this loop 34006 1726882664.51762: done getting the remaining hosts for this loop 34006 1726882664.51765: getting the next task for host managed_node3 34006 1726882664.51771: done getting next task for host managed_node3 34006 1726882664.51775: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34006 1726882664.51778: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
34006 1726882664.51799: getting variables
34006 1726882664.51800: in VariableManager get_vars()
34006 1726882664.51838: Calling all_inventory to load vars for managed_node3
34006 1726882664.51841: Calling groups_inventory to load vars for managed_node3
34006 1726882664.51843: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.51851: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.51854: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.51857: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.52328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.52522: done with get_vars()
34006 1726882664.52531: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 21:37:44 -0400 (0:00:00.033) 0:00:04.304 ******
34006 1726882664.52821: entering _queue_task() for managed_node3/include_tasks
34006 1726882664.53140: worker is 1 (out of 1 available)
34006 1726882664.53151: exiting _queue_task() for managed_node3/include_tasks
34006 1726882664.53163: done queuing things up, now waiting for results queue to drain
34006 1726882664.53165: waiting for pending results...
34006 1726882664.53537: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34006 1726882664.53727: in run() - task 12673a56-9f93-11ce-7734-0000000000b0
34006 1726882664.53732: variable 'ansible_search_path' from source: unknown
34006 1726882664.53735: variable 'ansible_search_path' from source: unknown
34006 1726882664.53784: calling self._execute()
34006 1726882664.53870: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.53882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.53925: variable 'omit' from source: magic vars
34006 1726882664.54264: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.54351: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.54400: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.54411: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.54419: when evaluation is False, skipping this task
34006 1726882664.54425: _execute() done
34006 1726882664.54431: dumping result to json
34006 1726882664.54438: done dumping result, returning
34006 1726882664.54448: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-11ce-7734-0000000000b0]
34006 1726882664.54464: sending task result for task 12673a56-9f93-11ce-7734-0000000000b0
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.54606: no more pending results, returning what we have
34006 1726882664.54609: results queue empty
34006 1726882664.54610: checking for any_errors_fatal
34006 1726882664.54618: done checking for any_errors_fatal
34006 1726882664.54619: checking for max_fail_percentage
34006 1726882664.54621: done checking for max_fail_percentage
34006 1726882664.54622: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.54622: done checking to see if all hosts have failed
34006 1726882664.54623: getting the remaining hosts for this loop
34006 1726882664.54624: done getting the remaining hosts for this loop
34006 1726882664.54628: getting the next task for host managed_node3
34006 1726882664.54634: done getting next task for host managed_node3
34006 1726882664.54638: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34006 1726882664.54640: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.54657: getting variables
34006 1726882664.54658: in VariableManager get_vars()
34006 1726882664.54702: Calling all_inventory to load vars for managed_node3
34006 1726882664.54705: Calling groups_inventory to load vars for managed_node3
34006 1726882664.54707: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.54715: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.54717: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.54720: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.54867: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b0
34006 1726882664.54871: WORKER PROCESS EXITING
34006 1726882664.54896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.55110: done with get_vars()
34006 1726882664.55120: done getting variables
34006 1726882664.55172: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 21:37:44 -0400 (0:00:00.023) 0:00:04.328 ******
34006 1726882664.55204: entering _queue_task() for managed_node3/debug
34006 1726882664.55423: worker is 1 (out of 1 available)
34006 1726882664.55436: exiting _queue_task() for managed_node3/debug
34006 1726882664.55447: done queuing things up, now waiting for results queue to drain
34006 1726882664.55448: waiting for pending results...
34006 1726882664.55747: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider
34006 1726882664.55905: in run() - task 12673a56-9f93-11ce-7734-0000000000b1
34006 1726882664.56018: variable 'ansible_search_path' from source: unknown
34006 1726882664.56026: variable 'ansible_search_path' from source: unknown
34006 1726882664.56137: calling self._execute()
34006 1726882664.56232: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.56505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.56508: variable 'omit' from source: magic vars
34006 1726882664.57052: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.57080: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.57214: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.57227: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.57236: when evaluation is False, skipping this task
34006 1726882664.57243: _execute() done
34006 1726882664.57251: dumping result to json
34006 1726882664.57263: done dumping result, returning
34006 1726882664.57274: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-11ce-7734-0000000000b1]
34006 1726882664.57286: sending task result for task 12673a56-9f93-11ce-7734-0000000000b1
skipping: [managed_node3] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34006 1726882664.57432: no more pending results, returning what we have
34006 1726882664.57436: results queue empty
34006 1726882664.57437: checking for any_errors_fatal
34006 1726882664.57442: done checking for any_errors_fatal
34006 1726882664.57444: checking for max_fail_percentage
34006 1726882664.57445: done checking for max_fail_percentage
34006 1726882664.57446: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.57447: done checking to see if all hosts have failed
34006 1726882664.57448: getting the remaining hosts for this loop
34006 1726882664.57449: done getting the remaining hosts for this loop
34006 1726882664.57453: getting the next task for host managed_node3
34006 1726882664.57460: done getting next task for host managed_node3
34006 1726882664.57464: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34006 1726882664.57467: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.57488: getting variables
34006 1726882664.57490: in VariableManager get_vars()
34006 1726882664.57538: Calling all_inventory to load vars for managed_node3
34006 1726882664.57541: Calling groups_inventory to load vars for managed_node3
34006 1726882664.57544: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.57554: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.57558: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.57561: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.58300: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b1
34006 1726882664.58304: WORKER PROCESS EXITING
34006 1726882664.58327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.58627: done with get_vars()
34006 1726882664.58636: done getting variables
34006 1726882664.58797: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 21:37:44 -0400 (0:00:00.036) 0:00:04.364 ******
34006 1726882664.58829: entering _queue_task() for managed_node3/fail
34006 1726882664.59710: worker is 1 (out of 1 available)
34006 1726882664.59718: exiting _queue_task() for managed_node3/fail
34006 1726882664.59728: done queuing things up, now waiting for results queue to drain
34006 1726882664.59729: waiting for pending results...
34006 1726882664.59785: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34006 1726882664.60025: in run() - task 12673a56-9f93-11ce-7734-0000000000b2
34006 1726882664.60115: variable 'ansible_search_path' from source: unknown
34006 1726882664.60180: variable 'ansible_search_path' from source: unknown
34006 1726882664.60222: calling self._execute()
34006 1726882664.60362: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.60600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.60605: variable 'omit' from source: magic vars
34006 1726882664.61302: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.61385: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.61615: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.61627: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.61634: when evaluation is False, skipping this task
34006 1726882664.61642: _execute() done
34006 1726882664.61648: dumping result to json
34006 1726882664.61654: done dumping result, returning
34006 1726882664.61666: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-11ce-7734-0000000000b2]
34006 1726882664.61708: sending task result for task 12673a56-9f93-11ce-7734-0000000000b2
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.61960: no more pending results, returning what we have
34006 1726882664.61964: results queue empty
34006 1726882664.61965: checking for any_errors_fatal
34006 1726882664.61973: done checking for any_errors_fatal
34006 1726882664.61974: checking for max_fail_percentage
34006 1726882664.61976: done checking for max_fail_percentage
34006 1726882664.61977: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.61977: done checking to see if all hosts have failed
34006 1726882664.61978: getting the remaining hosts for this loop
34006 1726882664.61980: done getting the remaining hosts for this loop
34006 1726882664.61983: getting the next task for host managed_node3
34006 1726882664.61990: done getting next task for host managed_node3
34006 1726882664.61995: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34006 1726882664.61998: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.62017: getting variables
34006 1726882664.62019: in VariableManager get_vars()
34006 1726882664.62063: Calling all_inventory to load vars for managed_node3
34006 1726882664.62066: Calling groups_inventory to load vars for managed_node3
34006 1726882664.62069: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.62079: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.62082: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.62085: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.62818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.63029: done with get_vars()
34006 1726882664.63038: done getting variables
34006 1726882664.63300: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b2
34006 1726882664.63304: WORKER PROCESS EXITING
34006 1726882664.63338: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:37:44 -0400 (0:00:00.045) 0:00:04.409 ******
34006 1726882664.63368: entering _queue_task() for managed_node3/fail
34006 1726882664.63678: worker is 1 (out of 1 available)
34006 1726882664.63691: exiting _queue_task() for managed_node3/fail
34006 1726882664.63903: done queuing things up, now waiting for results queue to drain
34006 1726882664.63905: waiting for pending results...
34006 1726882664.64134: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34006 1726882664.64499: in run() - task 12673a56-9f93-11ce-7734-0000000000b3
34006 1726882664.64503: variable 'ansible_search_path' from source: unknown
34006 1726882664.64506: variable 'ansible_search_path' from source: unknown
34006 1726882664.64570: calling self._execute()
34006 1726882664.64748: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.64764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.64777: variable 'omit' from source: magic vars
34006 1726882664.65477: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.65616: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.65767: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.65853: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.65860: when evaluation is False, skipping this task
34006 1726882664.65867: _execute() done
34006 1726882664.65873: dumping result to json
34006 1726882664.65878: done dumping result, returning
34006 1726882664.65887: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-11ce-7734-0000000000b3]
34006 1726882664.65898: sending task result for task 12673a56-9f93-11ce-7734-0000000000b3
34006 1726882664.66245: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b3
34006 1726882664.66248: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.66296: no more pending results, returning what we have
34006 1726882664.66300: results queue empty
34006 1726882664.66301: checking for any_errors_fatal
34006 1726882664.66307: done checking for any_errors_fatal
34006 1726882664.66308: checking for max_fail_percentage
34006 1726882664.66310: done checking for max_fail_percentage
34006 1726882664.66311: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.66312: done checking to see if all hosts have failed
34006 1726882664.66312: getting the remaining hosts for this loop
34006 1726882664.66314: done getting the remaining hosts for this loop
34006 1726882664.66317: getting the next task for host managed_node3
34006 1726882664.66325: done getting next task for host managed_node3
34006 1726882664.66328: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882664.66331: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.66351: getting variables
34006 1726882664.66353: in VariableManager get_vars()
34006 1726882664.66404: Calling all_inventory to load vars for managed_node3
34006 1726882664.66407: Calling groups_inventory to load vars for managed_node3
34006 1726882664.66410: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.66423: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.66426: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.66430: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.67156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.67427: done with get_vars()
34006 1726882664.67438: done getting variables
34006 1726882664.67514: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:37:44 -0400 (0:00:00.041) 0:00:04.451 ******
34006 1726882664.67546: entering _queue_task() for managed_node3/fail
34006 1726882664.67829: worker is 1 (out of 1 available)
34006 1726882664.67841: exiting _queue_task() for managed_node3/fail
34006 1726882664.67878: done queuing things up, now waiting for results queue to drain
34006 1726882664.67895: waiting for pending results...
34006 1726882664.68137: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882664.68264: in run() - task 12673a56-9f93-11ce-7734-0000000000b4
34006 1726882664.68284: variable 'ansible_search_path' from source: unknown
34006 1726882664.68292: variable 'ansible_search_path' from source: unknown
34006 1726882664.68330: calling self._execute()
34006 1726882664.68421: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.68431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.68445: variable 'omit' from source: magic vars
34006 1726882664.69099: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.69103: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.69148: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.69158: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.69498: when evaluation is False, skipping this task
34006 1726882664.69501: _execute() done
34006 1726882664.69504: dumping result to json
34006 1726882664.69506: done dumping result, returning
34006 1726882664.69509: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-11ce-7734-0000000000b4]
34006 1726882664.69511: sending task result for task 12673a56-9f93-11ce-7734-0000000000b4
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.69610: no more pending results, returning what we have
34006 1726882664.69612: results queue empty
34006 1726882664.69613: checking for any_errors_fatal
34006 1726882664.69618: done checking for any_errors_fatal
34006 1726882664.69619: checking for max_fail_percentage
34006 1726882664.69620: done checking for max_fail_percentage
34006 1726882664.69621: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.69622: done checking to see if all hosts have failed
34006 1726882664.69623: getting the remaining hosts for this loop
34006 1726882664.69624: done getting the remaining hosts for this loop
34006 1726882664.69627: getting the next task for host managed_node3
34006 1726882664.69632: done getting next task for host managed_node3
34006 1726882664.69635: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34006 1726882664.69637: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.69652: getting variables
34006 1726882664.69654: in VariableManager get_vars()
34006 1726882664.69698: Calling all_inventory to load vars for managed_node3
34006 1726882664.69701: Calling groups_inventory to load vars for managed_node3
34006 1726882664.69703: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.69710: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.69712: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.69715: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.69963: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b4
34006 1726882664.69967: WORKER PROCESS EXITING
34006 1726882664.69980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.70220: done with get_vars()
34006 1726882664.70236: done getting variables
34006 1726882664.70291: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:37:44 -0400 (0:00:00.027) 0:00:04.479 ******
34006 1726882664.70320: entering _queue_task() for managed_node3/dnf
34006 1726882664.70598: worker is 1 (out of 1 available)
34006 1726882664.70609: exiting _queue_task() for managed_node3/dnf
34006 1726882664.70619: done queuing things up, now waiting for results queue to drain
34006 1726882664.70621: waiting for pending results...
34006 1726882664.70857: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34006 1726882664.70979: in run() - task 12673a56-9f93-11ce-7734-0000000000b5
34006 1726882664.71012: variable 'ansible_search_path' from source: unknown
34006 1726882664.71019: variable 'ansible_search_path' from source: unknown
34006 1726882664.71053: calling self._execute()
34006 1726882664.71142: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.71153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.71166: variable 'omit' from source: magic vars
34006 1726882664.71532: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.71557: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.71682: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.71698: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.71705: when evaluation is False, skipping this task
34006 1726882664.71711: _execute() done
34006 1726882664.71718: dumping result to json
34006 1726882664.71724: done dumping result, returning
34006 1726882664.71734: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-0000000000b5]
34006 1726882664.71742: sending task result for task 12673a56-9f93-11ce-7734-0000000000b5
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.71911: no more pending results, returning what we have
34006 1726882664.71915: results queue empty
34006 1726882664.71915: checking for any_errors_fatal
34006 1726882664.71921: done checking for any_errors_fatal
34006 1726882664.71922: checking for max_fail_percentage
34006 1726882664.71924: done checking for max_fail_percentage
34006 1726882664.71925: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.71925: done checking to see if all hosts have failed
34006 1726882664.71926: getting the remaining hosts for this loop
34006 1726882664.71928: done getting the remaining hosts for this loop
34006 1726882664.71931: getting the next task for host managed_node3
34006 1726882664.71939: done getting next task for host managed_node3
34006 1726882664.71942: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34006 1726882664.71945: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.71963: getting variables
34006 1726882664.71965: in VariableManager get_vars()
34006 1726882664.72013: Calling all_inventory to load vars for managed_node3
34006 1726882664.72017: Calling groups_inventory to load vars for managed_node3
34006 1726882664.72020: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.72031: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.72034: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.72037: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.72409: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b5
34006 1726882664.72412: WORKER PROCESS EXITING
34006 1726882664.72445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.72664: done with get_vars()
34006 1726882664.72674: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882664.72758: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:37:44 -0400 (0:00:00.024) 0:00:04.504 ******
34006 1726882664.72785: entering _queue_task() for managed_node3/yum
34006 1726882664.73211: worker is 1 (out of 1 available)
34006 1726882664.73222: exiting _queue_task() for managed_node3/yum
34006 1726882664.73232: done queuing things up, now waiting for results queue to drain
34006 1726882664.73234: waiting for pending results...
34006 1726882664.73398: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34006 1726882664.73552: in run() - task 12673a56-9f93-11ce-7734-0000000000b6
34006 1726882664.73576: variable 'ansible_search_path' from source: unknown
34006 1726882664.73585: variable 'ansible_search_path' from source: unknown
34006 1726882664.73663: calling self._execute()
34006 1726882664.73843: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.73863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.73880: variable 'omit' from source: magic vars
34006 1726882664.74242: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.74259: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.74404: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.74502: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.74506: when evaluation is False, skipping this task
34006 1726882664.74511: _execute() done
34006 1726882664.74513: dumping result to json
34006 1726882664.74515: done dumping result, returning
34006 1726882664.74517: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-0000000000b6]
34006 1726882664.74519: sending task result for task 12673a56-9f93-11ce-7734-0000000000b6
34006 1726882664.74591: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b6
34006 1726882664.74597: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.74655: no more pending results, returning what we have
34006 1726882664.74659: results queue empty
34006 1726882664.74660: checking for any_errors_fatal
34006 1726882664.74665: done checking for any_errors_fatal
34006 1726882664.74666: checking for max_fail_percentage
34006 1726882664.74668: done checking for max_fail_percentage
34006 1726882664.74669: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.74670: done checking to see if all hosts have failed
34006 1726882664.74670: getting the remaining hosts for this loop
34006 1726882664.74672: done getting the remaining hosts for this loop
34006 1726882664.74675: getting the next task for host managed_node3
34006 1726882664.74682: done getting next task for host managed_node3
34006 1726882664.74686: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34006 1726882664.74691: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.74717: getting variables
34006 1726882664.74719: in VariableManager get_vars()
34006 1726882664.74763: Calling all_inventory to load vars for managed_node3
34006 1726882664.74766: Calling groups_inventory to load vars for managed_node3
34006 1726882664.74768: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.74778: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.74781: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.74784: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.75132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.75366: done with get_vars()
34006 1726882664.75375: done getting variables
34006 1726882664.75437: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024  21:37:44 -0400 (0:00:00.026)       0:00:04.530 ******
34006 1726882664.75464: entering _queue_task() for managed_node3/fail
34006 1726882664.75700: worker is 1 (out of 1 available)
34006 1726882664.75714: exiting _queue_task() for managed_node3/fail
34006 1726882664.75725: done queuing things up, now waiting for results queue to drain
34006 1726882664.75726: waiting for pending results...
34006 1726882664.76109: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34006 1726882664.76113: in run() - task 12673a56-9f93-11ce-7734-0000000000b7
34006 1726882664.76127: variable 'ansible_search_path' from source: unknown
34006 1726882664.76153: variable 'ansible_search_path' from source: unknown
34006 1726882664.76178: calling self._execute()
34006 1726882664.76265: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.76298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.76301: variable 'omit' from source: magic vars
34006 1726882664.76649: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.76664: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.76812: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.76816: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.76818: when evaluation is False, skipping this task
34006 1726882664.76821: _execute() done
34006 1726882664.76828: dumping result to json
34006 1726882664.76920: done dumping result, returning
34006 1726882664.76924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-0000000000b7]
34006 1726882664.76927: sending task result for task 12673a56-9f93-11ce-7734-0000000000b7
34006 1726882664.76997: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b7
34006 1726882664.77000: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.77068: no more pending results, returning what we have
34006 1726882664.77072: results queue empty
34006 1726882664.77072: checking for any_errors_fatal
34006 1726882664.77077: done checking for any_errors_fatal
34006 1726882664.77078: checking for max_fail_percentage
34006 1726882664.77080: done checking for max_fail_percentage
34006 1726882664.77081: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.77082: done checking to see if all hosts have failed
34006 1726882664.77082: getting the remaining hosts for this loop
34006 1726882664.77084: done getting the remaining hosts for this loop
34006 1726882664.77090: getting the next task for host managed_node3
34006 1726882664.77105: done getting next task for host managed_node3
34006 1726882664.77109: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
34006 1726882664.77112: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.77131: getting variables
34006 1726882664.77133: in VariableManager get_vars()
34006 1726882664.77178: Calling all_inventory to load vars for managed_node3
34006 1726882664.77181: Calling groups_inventory to load vars for managed_node3
34006 1726882664.77183: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.77303: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.77306: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.77310: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.77558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.77806: done with get_vars()
34006 1726882664.77816: done getting variables
34006 1726882664.77880: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:37:44 -0400 (0:00:00.024)       0:00:04.555 ******
34006 1726882664.77916: entering _queue_task() for managed_node3/package
34006 1726882664.78238: worker is 1 (out of 1 available)
34006 1726882664.78251: exiting _queue_task() for managed_node3/package
34006 1726882664.78263: done queuing things up, now waiting for results queue to drain
34006 1726882664.78264: waiting for pending results...
34006 1726882664.78619: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
34006 1726882664.78829: in run() - task 12673a56-9f93-11ce-7734-0000000000b8
34006 1726882664.78833: variable 'ansible_search_path' from source: unknown
34006 1726882664.78836: variable 'ansible_search_path' from source: unknown
34006 1726882664.78840: calling self._execute()
34006 1726882664.78899: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.78912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.78936: variable 'omit' from source: magic vars
34006 1726882664.79332: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.79350: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.79486: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.79507: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.79515: when evaluation is False, skipping this task
34006 1726882664.79523: _execute() done
34006 1726882664.79530: dumping result to json
34006 1726882664.79538: done dumping result, returning
34006 1726882664.79597: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-11ce-7734-0000000000b8]
34006 1726882664.79600: sending task result for task 12673a56-9f93-11ce-7734-0000000000b8
34006 1726882664.79674: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b8
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.79737: no more pending results, returning what we have
34006 1726882664.79741: results queue empty
34006 1726882664.79742: checking for any_errors_fatal
34006 1726882664.79752: done checking for any_errors_fatal
34006 1726882664.79753: checking for max_fail_percentage
34006 1726882664.79755: done checking for max_fail_percentage
34006 1726882664.79756: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.79756: done checking to see if all hosts have failed
34006 1726882664.79757: getting the remaining hosts for this loop
34006 1726882664.79759: done getting the remaining hosts for this loop
34006 1726882664.79762: getting the next task for host managed_node3
34006 1726882664.79770: done getting next task for host managed_node3
34006 1726882664.79774: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882664.79777: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.79801: getting variables
34006 1726882664.79805: in VariableManager get_vars()
34006 1726882664.79854: Calling all_inventory to load vars for managed_node3
34006 1726882664.79857: Calling groups_inventory to load vars for managed_node3
34006 1726882664.79860: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.79872: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.79875: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.79878: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.80279: WORKER PROCESS EXITING
34006 1726882664.80304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.80519: done with get_vars()
34006 1726882664.80528: done getting variables
34006 1726882664.80584: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:37:44 -0400 (0:00:00.027)       0:00:04.582 ******
34006 1726882664.80619: entering _queue_task() for managed_node3/package
34006 1726882664.80984: worker is 1 (out of 1 available)
34006 1726882664.81001: exiting _queue_task() for managed_node3/package
34006 1726882664.81010: done queuing things up, now waiting for results queue to drain
34006 1726882664.81012: waiting for pending results...
34006 1726882664.81230: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882664.81408: in run() - task 12673a56-9f93-11ce-7734-0000000000b9
34006 1726882664.81433: variable 'ansible_search_path' from source: unknown
34006 1726882664.81442: variable 'ansible_search_path' from source: unknown
34006 1726882664.81478: calling self._execute()
34006 1726882664.81583: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.81601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.81622: variable 'omit' from source: magic vars
34006 1726882664.82043: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.82114: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.82261: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.82281: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.82304: when evaluation is False, skipping this task
34006 1726882664.82390: _execute() done
34006 1726882664.82395: dumping result to json
34006 1726882664.82398: done dumping result, returning
34006 1726882664.82400: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-11ce-7734-0000000000b9]
34006 1726882664.82404: sending task result for task 12673a56-9f93-11ce-7734-0000000000b9
34006 1726882664.82474: done sending task result for task 12673a56-9f93-11ce-7734-0000000000b9
34006 1726882664.82477: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.82534: no more pending results, returning what we have
34006 1726882664.82538: results queue empty
34006 1726882664.82539: checking for any_errors_fatal
34006 1726882664.82547: done checking for any_errors_fatal
34006 1726882664.82548: checking for max_fail_percentage
34006 1726882664.82549: done checking for max_fail_percentage
34006 1726882664.82550: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.82551: done checking to see if all hosts have failed
34006 1726882664.82551: getting the remaining hosts for this loop
34006 1726882664.82553: done getting the remaining hosts for this loop
34006 1726882664.82557: getting the next task for host managed_node3
34006 1726882664.82563: done getting next task for host managed_node3
34006 1726882664.82567: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34006 1726882664.82571: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.82594: getting variables
34006 1726882664.82597: in VariableManager get_vars()
34006 1726882664.82760: Calling all_inventory to load vars for managed_node3
34006 1726882664.82764: Calling groups_inventory to load vars for managed_node3
34006 1726882664.82766: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.82777: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.82780: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.82905: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.83228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.83513: done with get_vars()
34006 1726882664.83522: done getting variables
34006 1726882664.83590: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:37:44 -0400 (0:00:00.029)       0:00:04.612 ******
34006 1726882664.83620: entering _queue_task() for managed_node3/package
34006 1726882664.84008: worker is 1 (out of 1 available)
34006 1726882664.84019: exiting _queue_task() for managed_node3/package
34006 1726882664.84029: done queuing things up, now waiting for results queue to drain
34006 1726882664.84030: waiting for pending results...
34006 1726882664.84235: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34006 1726882664.84334: in run() - task 12673a56-9f93-11ce-7734-0000000000ba
34006 1726882664.84338: variable 'ansible_search_path' from source: unknown
34006 1726882664.84341: variable 'ansible_search_path' from source: unknown
34006 1726882664.84362: calling self._execute()
34006 1726882664.84453: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.84464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.84484: variable 'omit' from source: magic vars
34006 1726882664.84858: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.84907: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.85021: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.85031: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.85038: when evaluation is False, skipping this task
34006 1726882664.85044: _execute() done
34006 1726882664.85100: dumping result to json
34006 1726882664.85103: done dumping result, returning
34006 1726882664.85105: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-11ce-7734-0000000000ba]
34006 1726882664.85108: sending task result for task 12673a56-9f93-11ce-7734-0000000000ba
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.85248: no more pending results, returning what we have
34006 1726882664.85252: results queue empty
34006 1726882664.85253: checking for any_errors_fatal
34006 1726882664.85260: done checking for any_errors_fatal
34006 1726882664.85261: checking for max_fail_percentage
34006 1726882664.85263: done checking for max_fail_percentage
34006 1726882664.85264: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.85264: done checking to see if all hosts have failed
34006 1726882664.85265: getting the remaining hosts for this loop
34006 1726882664.85267: done getting the remaining hosts for this loop
34006 1726882664.85270: getting the next task for host managed_node3
34006 1726882664.85278: done getting next task for host managed_node3
34006 1726882664.85282: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34006 1726882664.85286: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.85311: getting variables
34006 1726882664.85312: in VariableManager get_vars()
34006 1726882664.85357: Calling all_inventory to load vars for managed_node3
34006 1726882664.85360: Calling groups_inventory to load vars for managed_node3
34006 1726882664.85362: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.85374: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.85377: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.85380: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.85774: done sending task result for task 12673a56-9f93-11ce-7734-0000000000ba
34006 1726882664.85777: WORKER PROCESS EXITING
34006 1726882664.85803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.86031: done with get_vars()
34006 1726882664.86040: done getting variables
34006 1726882664.86104: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:37:44 -0400 (0:00:00.025)       0:00:04.637 ******
34006 1726882664.86134: entering _queue_task() for managed_node3/service
34006 1726882664.86504: worker is 1 (out of 1 available)
34006 1726882664.86514: exiting _queue_task() for managed_node3/service
34006 1726882664.86523: done queuing things up, now waiting for results queue to drain
34006 1726882664.86525: waiting for pending results...
34006 1726882664.86712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34006 1726882664.86844: in run() - task 12673a56-9f93-11ce-7734-0000000000bb
34006 1726882664.86868: variable 'ansible_search_path' from source: unknown
34006 1726882664.86876: variable 'ansible_search_path' from source: unknown
34006 1726882664.86924: calling self._execute()
34006 1726882664.87011: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.87024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.87047: variable 'omit' from source: magic vars
34006 1726882664.87407: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.87427: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.87558: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.87570: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.87586: when evaluation is False, skipping this task
34006 1726882664.87599: _execute() done
34006 1726882664.87607: dumping result to json
34006 1726882664.87629: done dumping result, returning
34006 1726882664.87633: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-0000000000bb]
34006 1726882664.87635: sending task result for task 12673a56-9f93-11ce-7734-0000000000bb
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.87953: no more pending results, returning what we have
34006 1726882664.87957: results queue empty
34006 1726882664.87958: checking for any_errors_fatal
34006 1726882664.87964: done checking for any_errors_fatal
34006 1726882664.87965: checking for max_fail_percentage
34006 1726882664.87966: done checking for max_fail_percentage
34006 1726882664.87967: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.87968: done checking to see if all hosts have failed
34006 1726882664.87969: getting the remaining hosts for this loop
34006 1726882664.87971: done getting the remaining hosts for this loop
34006 1726882664.87974: getting the next task for host managed_node3
34006 1726882664.87980: done getting next task for host managed_node3
34006 1726882664.87984: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34006 1726882664.87990: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.88066: getting variables
34006 1726882664.88069: in VariableManager get_vars()
34006 1726882664.88117: Calling all_inventory to load vars for managed_node3
34006 1726882664.88197: Calling groups_inventory to load vars for managed_node3
34006 1726882664.88201: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.88207: done sending task result for task 12673a56-9f93-11ce-7734-0000000000bb
34006 1726882664.88210: WORKER PROCESS EXITING
34006 1726882664.88218: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.88221: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.88224: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.88491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.88728: done with get_vars()
34006 1726882664.88738: done getting variables
34006 1726882664.88809: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:37:44 -0400 (0:00:00.027)       0:00:04.664 ******
34006 1726882664.88837: entering _queue_task() for managed_node3/service
34006 1726882664.89130: worker is 1 (out of 1 available)
34006 1726882664.89144: exiting _queue_task() for managed_node3/service
34006 1726882664.89156: done queuing things up, now waiting for results queue to drain
34006 1726882664.89157: waiting for pending results...
34006 1726882664.89422: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34006 1726882664.89561: in run() - task 12673a56-9f93-11ce-7734-0000000000bc
34006 1726882664.89583: variable 'ansible_search_path' from source: unknown
34006 1726882664.89596: variable 'ansible_search_path' from source: unknown
34006 1726882664.89635: calling self._execute()
34006 1726882664.89735: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.89748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.89870: variable 'omit' from source: magic vars
34006 1726882664.90150: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.90167: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.90302: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.90324: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.90424: when evaluation is False, skipping this task
34006 1726882664.90428: _execute() done
34006 1726882664.90431: dumping result to json
34006 1726882664.90433: done dumping result, returning
34006 1726882664.90436: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-11ce-7734-0000000000bc]
34006 1726882664.90438: sending task result for task 12673a56-9f93-11ce-7734-0000000000bc
34006 1726882664.90510: done sending task result for task 12673a56-9f93-11ce-7734-0000000000bc
34006 1726882664.90514: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34006 1726882664.90568: no more pending results, returning what we have
34006 1726882664.90572: results queue empty
34006 1726882664.90573: checking for any_errors_fatal
34006 1726882664.90581: done checking for any_errors_fatal
34006 1726882664.90582: checking for max_fail_percentage
34006 1726882664.90585: done checking for max_fail_percentage
34006 1726882664.90585: checking to see if all hosts have failed and the running result is not ok
34006 1726882664.90586: done checking to see if all hosts have failed
34006 1726882664.90587: getting the remaining hosts for this loop
34006 1726882664.90591: done getting the remaining hosts for this loop
34006 1726882664.90597: getting the next task for host managed_node3
34006 1726882664.90604: done getting next task for host managed_node3
34006 1726882664.90608: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34006 1726882664.90612: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34006 1726882664.90629: getting variables
34006 1726882664.90633: in VariableManager get_vars()
34006 1726882664.90676: Calling all_inventory to load vars for managed_node3
34006 1726882664.90679: Calling groups_inventory to load vars for managed_node3
34006 1726882664.90681: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882664.90842: Calling all_plugins_play to load vars for managed_node3
34006 1726882664.90846: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882664.90850: Calling groups_plugins_play to load vars for managed_node3
34006 1726882664.91021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882664.91219: done with get_vars()
34006 1726882664.91235: done getting variables
34006 1726882664.91296: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024  21:37:44 -0400 (0:00:00.024)       0:00:04.689 ******
34006 1726882664.91324: entering _queue_task() for managed_node3/service
34006 1726882664.91550: worker is 1 (out of 1 available)
34006 1726882664.91675: exiting _queue_task() for managed_node3/service
34006 1726882664.91685: done queuing things up, now waiting for results queue to drain
34006 1726882664.91686: waiting for pending results...
34006 1726882664.91905: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34006 1726882664.91952: in run() - task 12673a56-9f93-11ce-7734-0000000000bd
34006 1726882664.91971: variable 'ansible_search_path' from source: unknown
34006 1726882664.92002: variable 'ansible_search_path' from source: unknown
34006 1726882664.92030: calling self._execute()
34006 1726882664.92113: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882664.92198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882664.92201: variable 'omit' from source: magic vars
34006 1726882664.92498: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.92515: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882664.92631: variable 'ansible_distribution_major_version' from source: facts
34006 1726882664.92642: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882664.92657: when evaluation is False, skipping this task
34006 1726882664.92664: _execute() done
34006 1726882664.92671: dumping result to json
34006 1726882664.92677: done dumping result, returning
34006 1726882664.92685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-11ce-7734-0000000000bd]
34006 1726882664.92762: sending task result for task 12673a56-9f93-11ce-7734-0000000000bd
34006 1726882664.92833: done sending task result for task 12673a56-9f93-11ce-7734-0000000000bd
34006 1726882664.92835: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882664.92885: no more pending results, returning what we have
34006 1726882664.92892: results queue empty
34006 1726882664.92895: checking for any_errors_fatal
34006 1726882664.92902: done checking for any_errors_fatal 34006 1726882664.92903: checking for max_fail_percentage 34006 1726882664.92905: done checking for max_fail_percentage 34006 1726882664.92905: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.92906: done checking to see if all hosts have failed 34006 1726882664.92907: getting the remaining hosts for this loop 34006 1726882664.92908: done getting the remaining hosts for this loop 34006 1726882664.92912: getting the next task for host managed_node3 34006 1726882664.92919: done getting next task for host managed_node3 34006 1726882664.92922: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882664.92926: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.92943: getting variables 34006 1726882664.92945: in VariableManager get_vars() 34006 1726882664.92994: Calling all_inventory to load vars for managed_node3 34006 1726882664.92997: Calling groups_inventory to load vars for managed_node3 34006 1726882664.93000: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.93012: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.93015: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.93018: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.93402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.93623: done with get_vars() 34006 1726882664.93639: done getting variables 34006 1726882664.93707: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:44 -0400 (0:00:00.024) 0:00:04.713 ****** 34006 1726882664.93735: entering _queue_task() for managed_node3/service 34006 1726882664.94109: worker is 1 (out of 1 available) 34006 1726882664.94121: exiting _queue_task() for managed_node3/service 34006 1726882664.94130: done queuing things up, now waiting for results queue to drain 34006 1726882664.94132: waiting for pending results... 
34006 1726882664.94435: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882664.94469: in run() - task 12673a56-9f93-11ce-7734-0000000000be 34006 1726882664.94506: variable 'ansible_search_path' from source: unknown 34006 1726882664.94521: variable 'ansible_search_path' from source: unknown 34006 1726882664.94619: calling self._execute() 34006 1726882664.94657: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.94670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.94685: variable 'omit' from source: magic vars 34006 1726882664.95560: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.95803: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.95868: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.95880: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.95891: when evaluation is False, skipping this task 34006 1726882664.95909: _execute() done 34006 1726882664.95924: dumping result to json 34006 1726882664.95982: done dumping result, returning 34006 1726882664.95985: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-11ce-7734-0000000000be] 34006 1726882664.95991: sending task result for task 12673a56-9f93-11ce-7734-0000000000be skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882664.96328: no more pending results, returning what we have 34006 1726882664.96332: results queue empty 34006 1726882664.96333: checking for any_errors_fatal 34006 1726882664.96341: done checking for any_errors_fatal 34006 1726882664.96342: checking for max_fail_percentage 34006 1726882664.96344: done checking for 
max_fail_percentage 34006 1726882664.96345: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.96346: done checking to see if all hosts have failed 34006 1726882664.96346: getting the remaining hosts for this loop 34006 1726882664.96348: done getting the remaining hosts for this loop 34006 1726882664.96351: getting the next task for host managed_node3 34006 1726882664.96363: done getting next task for host managed_node3 34006 1726882664.96367: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.96370: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.96392: getting variables 34006 1726882664.96396: in VariableManager get_vars() 34006 1726882664.96443: Calling all_inventory to load vars for managed_node3 34006 1726882664.96446: Calling groups_inventory to load vars for managed_node3 34006 1726882664.96448: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.96460: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.96691: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.96698: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.96865: done sending task result for task 12673a56-9f93-11ce-7734-0000000000be 34006 1726882664.96868: WORKER PROCESS EXITING 34006 1726882664.96886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882664.97112: done with get_vars() 34006 1726882664.97127: done getting variables 34006 1726882664.97175: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:44 -0400 (0:00:00.034) 0:00:04.748 ****** 34006 1726882664.97205: entering _queue_task() for managed_node3/copy 34006 1726882664.97427: worker is 1 (out of 1 available) 34006 1726882664.97438: exiting _queue_task() for managed_node3/copy 34006 1726882664.97455: done queuing things up, now waiting for results queue to drain 34006 1726882664.97457: waiting for pending results... 
34006 1726882664.97750: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882664.97922: in run() - task 12673a56-9f93-11ce-7734-0000000000bf 34006 1726882664.97948: variable 'ansible_search_path' from source: unknown 34006 1726882664.97957: variable 'ansible_search_path' from source: unknown 34006 1726882664.98010: calling self._execute() 34006 1726882664.98111: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882664.98127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882664.98145: variable 'omit' from source: magic vars 34006 1726882664.98811: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.98815: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882664.98852: variable 'ansible_distribution_major_version' from source: facts 34006 1726882664.98907: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882664.99099: when evaluation is False, skipping this task 34006 1726882664.99102: _execute() done 34006 1726882664.99105: dumping result to json 34006 1726882664.99107: done dumping result, returning 34006 1726882664.99110: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-11ce-7734-0000000000bf] 34006 1726882664.99113: sending task result for task 12673a56-9f93-11ce-7734-0000000000bf 34006 1726882664.99177: done sending task result for task 12673a56-9f93-11ce-7734-0000000000bf 34006 1726882664.99180: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882664.99230: no more pending results, returning what we have 34006 1726882664.99234: results queue empty 34006 
1726882664.99235: checking for any_errors_fatal 34006 1726882664.99243: done checking for any_errors_fatal 34006 1726882664.99244: checking for max_fail_percentage 34006 1726882664.99245: done checking for max_fail_percentage 34006 1726882664.99246: checking to see if all hosts have failed and the running result is not ok 34006 1726882664.99247: done checking to see if all hosts have failed 34006 1726882664.99247: getting the remaining hosts for this loop 34006 1726882664.99249: done getting the remaining hosts for this loop 34006 1726882664.99252: getting the next task for host managed_node3 34006 1726882664.99258: done getting next task for host managed_node3 34006 1726882664.99261: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882664.99265: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882664.99281: getting variables 34006 1726882664.99283: in VariableManager get_vars() 34006 1726882664.99329: Calling all_inventory to load vars for managed_node3 34006 1726882664.99332: Calling groups_inventory to load vars for managed_node3 34006 1726882664.99334: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882664.99344: Calling all_plugins_play to load vars for managed_node3 34006 1726882664.99346: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882664.99349: Calling groups_plugins_play to load vars for managed_node3 34006 1726882664.99790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.00046: done with get_vars() 34006 1726882665.00061: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:45 -0400 (0:00:00.029) 0:00:04.777 ****** 34006 1726882665.00146: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882665.00370: worker is 1 (out of 1 available) 34006 1726882665.00502: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882665.00512: done queuing things up, now waiting for results queue to drain 34006 1726882665.00513: waiting for pending results... 
34006 1726882665.00722: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882665.00800: in run() - task 12673a56-9f93-11ce-7734-0000000000c0 34006 1726882665.00828: variable 'ansible_search_path' from source: unknown 34006 1726882665.00837: variable 'ansible_search_path' from source: unknown 34006 1726882665.00879: calling self._execute() 34006 1726882665.00974: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.00986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.01037: variable 'omit' from source: magic vars 34006 1726882665.01372: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.01399: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.01529: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.01584: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.01590: when evaluation is False, skipping this task 34006 1726882665.01595: _execute() done 34006 1726882665.01598: dumping result to json 34006 1726882665.01601: done dumping result, returning 34006 1726882665.01604: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-11ce-7734-0000000000c0] 34006 1726882665.01607: sending task result for task 12673a56-9f93-11ce-7734-0000000000c0 34006 1726882665.01776: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c0 34006 1726882665.01779: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.01945: no more pending results, returning what we have 34006 1726882665.01948: results queue empty 34006 1726882665.01948: checking 
for any_errors_fatal 34006 1726882665.01954: done checking for any_errors_fatal 34006 1726882665.01955: checking for max_fail_percentage 34006 1726882665.01956: done checking for max_fail_percentage 34006 1726882665.01957: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.01958: done checking to see if all hosts have failed 34006 1726882665.01959: getting the remaining hosts for this loop 34006 1726882665.01960: done getting the remaining hosts for this loop 34006 1726882665.01963: getting the next task for host managed_node3 34006 1726882665.01969: done getting next task for host managed_node3 34006 1726882665.01972: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882665.01975: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.01996: getting variables 34006 1726882665.01998: in VariableManager get_vars() 34006 1726882665.02041: Calling all_inventory to load vars for managed_node3 34006 1726882665.02044: Calling groups_inventory to load vars for managed_node3 34006 1726882665.02046: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.02057: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.02060: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.02063: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.02330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.02540: done with get_vars() 34006 1726882665.02558: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:04.802 ****** 34006 1726882665.02663: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882665.03002: worker is 1 (out of 1 available) 34006 1726882665.03012: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882665.03024: done queuing things up, now waiting for results queue to drain 34006 1726882665.03025: waiting for pending results... 
34006 1726882665.03314: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882665.03327: in run() - task 12673a56-9f93-11ce-7734-0000000000c1 34006 1726882665.03410: variable 'ansible_search_path' from source: unknown 34006 1726882665.03413: variable 'ansible_search_path' from source: unknown 34006 1726882665.03416: calling self._execute() 34006 1726882665.03481: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.03499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.03522: variable 'omit' from source: magic vars 34006 1726882665.03905: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.03923: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.04064: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.04075: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.04083: when evaluation is False, skipping this task 34006 1726882665.04173: _execute() done 34006 1726882665.04177: dumping result to json 34006 1726882665.04179: done dumping result, returning 34006 1726882665.04182: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-11ce-7734-0000000000c1] 34006 1726882665.04185: sending task result for task 12673a56-9f93-11ce-7734-0000000000c1 34006 1726882665.04258: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c1 34006 1726882665.04261: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.04326: no more pending results, returning what we have 34006 1726882665.04330: results queue empty 34006 1726882665.04330: checking for any_errors_fatal 34006 
1726882665.04337: done checking for any_errors_fatal 34006 1726882665.04337: checking for max_fail_percentage 34006 1726882665.04339: done checking for max_fail_percentage 34006 1726882665.04340: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.04341: done checking to see if all hosts have failed 34006 1726882665.04342: getting the remaining hosts for this loop 34006 1726882665.04343: done getting the remaining hosts for this loop 34006 1726882665.04347: getting the next task for host managed_node3 34006 1726882665.04355: done getting next task for host managed_node3 34006 1726882665.04359: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882665.04363: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.04380: getting variables 34006 1726882665.04382: in VariableManager get_vars() 34006 1726882665.04435: Calling all_inventory to load vars for managed_node3 34006 1726882665.04438: Calling groups_inventory to load vars for managed_node3 34006 1726882665.04441: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.04453: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.04456: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.04459: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.04849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.05043: done with get_vars() 34006 1726882665.05052: done getting variables 34006 1726882665.05107: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:45 -0400 (0:00:00.024) 0:00:04.827 ****** 34006 1726882665.05136: entering _queue_task() for managed_node3/debug 34006 1726882665.05352: worker is 1 (out of 1 available) 34006 1726882665.05479: exiting _queue_task() for managed_node3/debug 34006 1726882665.05492: done queuing things up, now waiting for results queue to drain 34006 1726882665.05496: waiting for pending results... 
34006 1726882665.05710: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882665.05768: in run() - task 12673a56-9f93-11ce-7734-0000000000c2 34006 1726882665.05795: variable 'ansible_search_path' from source: unknown 34006 1726882665.05808: variable 'ansible_search_path' from source: unknown 34006 1726882665.05855: calling self._execute() 34006 1726882665.05999: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.06003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.06006: variable 'omit' from source: magic vars 34006 1726882665.06331: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.06349: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.06466: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.06481: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.06491: when evaluation is False, skipping this task 34006 1726882665.06500: _execute() done 34006 1726882665.06507: dumping result to json 34006 1726882665.06513: done dumping result, returning 34006 1726882665.06523: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-11ce-7734-0000000000c2] 34006 1726882665.06568: sending task result for task 12673a56-9f93-11ce-7734-0000000000c2 34006 1726882665.06630: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c2 34006 1726882665.06632: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.06792: no more pending results, returning what we have 34006 1726882665.06797: results queue empty 34006 1726882665.06798: checking for any_errors_fatal 34006 1726882665.06802: done 
checking for any_errors_fatal 34006 1726882665.06802: checking for max_fail_percentage 34006 1726882665.06804: done checking for max_fail_percentage 34006 1726882665.06804: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.06805: done checking to see if all hosts have failed 34006 1726882665.06806: getting the remaining hosts for this loop 34006 1726882665.06807: done getting the remaining hosts for this loop 34006 1726882665.06810: getting the next task for host managed_node3 34006 1726882665.06816: done getting next task for host managed_node3 34006 1726882665.06820: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882665.06822: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.06836: getting variables 34006 1726882665.06838: in VariableManager get_vars() 34006 1726882665.06876: Calling all_inventory to load vars for managed_node3 34006 1726882665.06878: Calling groups_inventory to load vars for managed_node3 34006 1726882665.06881: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.06958: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.06961: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.06965: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.07130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.07333: done with get_vars() 34006 1726882665.07342: done getting variables 34006 1726882665.07396: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:45 -0400 (0:00:00.022) 0:00:04.850 ****** 34006 1726882665.07422: entering _queue_task() for managed_node3/debug 34006 1726882665.07636: worker is 1 (out of 1 available) 34006 1726882665.07875: exiting _queue_task() for managed_node3/debug 34006 1726882665.07884: done queuing things up, now waiting for results queue to drain 34006 1726882665.07886: waiting for pending results... 
34006 1726882665.07911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882665.08033: in run() - task 12673a56-9f93-11ce-7734-0000000000c3 34006 1726882665.08055: variable 'ansible_search_path' from source: unknown 34006 1726882665.08062: variable 'ansible_search_path' from source: unknown 34006 1726882665.08105: calling self._execute() 34006 1726882665.08192: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.08206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.08229: variable 'omit' from source: magic vars 34006 1726882665.08659: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.08663: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.08730: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.08740: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.08747: when evaluation is False, skipping this task 34006 1726882665.08753: _execute() done 34006 1726882665.08766: dumping result to json 34006 1726882665.08773: done dumping result, returning 34006 1726882665.08783: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-11ce-7734-0000000000c3] 34006 1726882665.08797: sending task result for task 12673a56-9f93-11ce-7734-0000000000c3 skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.09045: no more pending results, returning what we have 34006 1726882665.09049: results queue empty 34006 1726882665.09051: checking for any_errors_fatal 34006 1726882665.09055: done checking for any_errors_fatal 34006 1726882665.09056: checking for max_fail_percentage 34006 1726882665.09058: done checking for 
max_fail_percentage 34006 1726882665.09059: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.09060: done checking to see if all hosts have failed 34006 1726882665.09060: getting the remaining hosts for this loop 34006 1726882665.09062: done getting the remaining hosts for this loop 34006 1726882665.09065: getting the next task for host managed_node3 34006 1726882665.09072: done getting next task for host managed_node3 34006 1726882665.09076: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882665.09080: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.09107: getting variables 34006 1726882665.09109: in VariableManager get_vars() 34006 1726882665.09156: Calling all_inventory to load vars for managed_node3 34006 1726882665.09159: Calling groups_inventory to load vars for managed_node3 34006 1726882665.09161: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.09173: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.09176: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.09179: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.09511: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c3 34006 1726882665.09515: WORKER PROCESS EXITING 34006 1726882665.09541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.09741: done with get_vars() 34006 1726882665.09755: done getting variables 34006 1726882665.09814: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:45 -0400 (0:00:00.024) 0:00:04.874 ****** 34006 1726882665.09846: entering _queue_task() for managed_node3/debug 34006 1726882665.10123: worker is 1 (out of 1 available) 34006 1726882665.10135: exiting _queue_task() for managed_node3/debug 34006 1726882665.10146: done queuing things up, now waiting for results queue to drain 34006 1726882665.10147: waiting for pending results... 
34006 1726882665.10400: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882665.10691: in run() - task 12673a56-9f93-11ce-7734-0000000000c4 34006 1726882665.10696: variable 'ansible_search_path' from source: unknown 34006 1726882665.10699: variable 'ansible_search_path' from source: unknown 34006 1726882665.10706: calling self._execute() 34006 1726882665.10800: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.10818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.10834: variable 'omit' from source: magic vars 34006 1726882665.11250: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.11253: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.11358: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.11361: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.11363: when evaluation is False, skipping this task 34006 1726882665.11366: _execute() done 34006 1726882665.11368: dumping result to json 34006 1726882665.11370: done dumping result, returning 34006 1726882665.11466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-11ce-7734-0000000000c4] 34006 1726882665.11470: sending task result for task 12673a56-9f93-11ce-7734-0000000000c4 34006 1726882665.11536: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c4 34006 1726882665.11539: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.11581: no more pending results, returning what we have 34006 1726882665.11585: results queue empty 34006 1726882665.11586: checking for any_errors_fatal 34006 1726882665.11600: done checking for 
any_errors_fatal 34006 1726882665.11601: checking for max_fail_percentage 34006 1726882665.11603: done checking for max_fail_percentage 34006 1726882665.11604: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.11605: done checking to see if all hosts have failed 34006 1726882665.11606: getting the remaining hosts for this loop 34006 1726882665.11607: done getting the remaining hosts for this loop 34006 1726882665.11611: getting the next task for host managed_node3 34006 1726882665.11618: done getting next task for host managed_node3 34006 1726882665.11622: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882665.11626: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.11644: getting variables 34006 1726882665.11645: in VariableManager get_vars() 34006 1726882665.11692: Calling all_inventory to load vars for managed_node3 34006 1726882665.11839: Calling groups_inventory to load vars for managed_node3 34006 1726882665.11842: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.11850: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.11853: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.11855: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.12034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.12241: done with get_vars() 34006 1726882665.12249: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:45 -0400 (0:00:00.024) 0:00:04.899 ****** 34006 1726882665.12343: entering _queue_task() for managed_node3/ping 34006 1726882665.12571: worker is 1 (out of 1 available) 34006 1726882665.12583: exiting _queue_task() for managed_node3/ping 34006 1726882665.12709: done queuing things up, now waiting for results queue to drain 34006 1726882665.12711: waiting for pending results... 
34006 1726882665.12937: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882665.13035: in run() - task 12673a56-9f93-11ce-7734-0000000000c5 34006 1726882665.13039: variable 'ansible_search_path' from source: unknown 34006 1726882665.13041: variable 'ansible_search_path' from source: unknown 34006 1726882665.13060: calling self._execute() 34006 1726882665.13142: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.13154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.13174: variable 'omit' from source: magic vars 34006 1726882665.13580: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.13584: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.13686: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.13703: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.13714: when evaluation is False, skipping this task 34006 1726882665.13799: _execute() done 34006 1726882665.13802: dumping result to json 34006 1726882665.13804: done dumping result, returning 34006 1726882665.13807: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-11ce-7734-0000000000c5] 34006 1726882665.13809: sending task result for task 12673a56-9f93-11ce-7734-0000000000c5 34006 1726882665.13874: done sending task result for task 12673a56-9f93-11ce-7734-0000000000c5 34006 1726882665.13877: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.13929: no more pending results, returning what we have 34006 1726882665.13933: results queue empty 34006 1726882665.13934: checking for any_errors_fatal 34006 
1726882665.13940: done checking for any_errors_fatal 34006 1726882665.13941: checking for max_fail_percentage 34006 1726882665.13943: done checking for max_fail_percentage 34006 1726882665.13944: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.13944: done checking to see if all hosts have failed 34006 1726882665.13945: getting the remaining hosts for this loop 34006 1726882665.13947: done getting the remaining hosts for this loop 34006 1726882665.13950: getting the next task for host managed_node3 34006 1726882665.13959: done getting next task for host managed_node3 34006 1726882665.13961: ^ task is: TASK: meta (role_complete) 34006 1726882665.13965: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.13983: getting variables 34006 1726882665.13985: in VariableManager get_vars() 34006 1726882665.14036: Calling all_inventory to load vars for managed_node3 34006 1726882665.14039: Calling groups_inventory to load vars for managed_node3 34006 1726882665.14042: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.14054: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.14056: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.14059: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.14438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.14637: done with get_vars() 34006 1726882665.14646: done getting variables 34006 1726882665.14726: done queuing things up, now waiting for results queue to drain 34006 1726882665.14728: results queue empty 34006 1726882665.14729: checking for any_errors_fatal 34006 1726882665.14730: done checking for any_errors_fatal 34006 1726882665.14731: checking for max_fail_percentage 34006 1726882665.14732: done checking for max_fail_percentage 34006 1726882665.14733: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.14733: done checking to see if all hosts have failed 34006 1726882665.14734: getting the remaining hosts for this loop 34006 1726882665.14735: done getting the remaining hosts for this loop 34006 1726882665.14737: getting the next task for host managed_node3 34006 1726882665.14743: done getting next task for host managed_node3 34006 1726882665.14746: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34006 1726882665.14749: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34006 1726882665.14757: getting variables 34006 1726882665.14758: in VariableManager get_vars() 34006 1726882665.14775: Calling all_inventory to load vars for managed_node3 34006 1726882665.14777: Calling groups_inventory to load vars for managed_node3 34006 1726882665.14779: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.14783: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.14785: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.14790: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.14928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.15132: done with get_vars() 34006 1726882665.15140: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:37:45 -0400 (0:00:00.028) 0:00:04.928 ****** 34006 1726882665.15215: entering _queue_task() for managed_node3/include_tasks 34006 1726882665.15578: worker is 1 (out of 1 available) 34006 1726882665.15591: exiting _queue_task() for managed_node3/include_tasks 34006 1726882665.15606: done queuing things up, now waiting for 
results queue to drain 34006 1726882665.15608: waiting for pending results... 34006 1726882665.15844: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34006 1726882665.15916: in run() - task 12673a56-9f93-11ce-7734-0000000000fd 34006 1726882665.15938: variable 'ansible_search_path' from source: unknown 34006 1726882665.16001: variable 'ansible_search_path' from source: unknown 34006 1726882665.16005: calling self._execute() 34006 1726882665.16066: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.16079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.16098: variable 'omit' from source: magic vars 34006 1726882665.16460: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.16476: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.16607: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.16618: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.16654: when evaluation is False, skipping this task 34006 1726882665.16657: _execute() done 34006 1726882665.16659: dumping result to json 34006 1726882665.16661: done dumping result, returning 34006 1726882665.16664: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-11ce-7734-0000000000fd] 34006 1726882665.16666: sending task result for task 12673a56-9f93-11ce-7734-0000000000fd 34006 1726882665.16829: done sending task result for task 12673a56-9f93-11ce-7734-0000000000fd 34006 1726882665.16832: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.16882: no more pending results, returning what we have 34006 
1726882665.16886: results queue empty 34006 1726882665.16890: checking for any_errors_fatal 34006 1726882665.16892: done checking for any_errors_fatal 34006 1726882665.16892: checking for max_fail_percentage 34006 1726882665.16896: done checking for max_fail_percentage 34006 1726882665.16897: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.16898: done checking to see if all hosts have failed 34006 1726882665.16898: getting the remaining hosts for this loop 34006 1726882665.16900: done getting the remaining hosts for this loop 34006 1726882665.16904: getting the next task for host managed_node3 34006 1726882665.16913: done getting next task for host managed_node3 34006 1726882665.16917: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34006 1726882665.16921: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.16940: getting variables 34006 1726882665.16943: in VariableManager get_vars() 34006 1726882665.17138: Calling all_inventory to load vars for managed_node3 34006 1726882665.17141: Calling groups_inventory to load vars for managed_node3 34006 1726882665.17144: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.17152: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.17155: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.17157: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.17585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.17780: done with get_vars() 34006 1726882665.17791: done getting variables 34006 1726882665.17844: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:37:45 -0400 (0:00:00.026) 0:00:04.955 ****** 34006 1726882665.17878: entering _queue_task() for managed_node3/debug 34006 1726882665.18207: worker is 1 (out of 1 available) 34006 1726882665.18218: exiting _queue_task() for managed_node3/debug 34006 1726882665.18226: done queuing things up, now waiting for results queue to drain 34006 1726882665.18228: waiting for pending results... 
34006 1726882665.18713: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 34006 1726882665.19000: in run() - task 12673a56-9f93-11ce-7734-0000000000fe 34006 1726882665.19003: variable 'ansible_search_path' from source: unknown 34006 1726882665.19006: variable 'ansible_search_path' from source: unknown 34006 1726882665.19008: calling self._execute() 34006 1726882665.19098: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.19144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.19159: variable 'omit' from source: magic vars 34006 1726882665.19830: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.19848: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.19976: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.19989: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.20000: when evaluation is False, skipping this task 34006 1726882665.20007: _execute() done 34006 1726882665.20020: dumping result to json 34006 1726882665.20028: done dumping result, returning 34006 1726882665.20037: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-11ce-7734-0000000000fe] 34006 1726882665.20124: sending task result for task 12673a56-9f93-11ce-7734-0000000000fe 34006 1726882665.20203: done sending task result for task 12673a56-9f93-11ce-7734-0000000000fe 34006 1726882665.20206: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.20273: no more pending results, returning what we have 34006 1726882665.20277: results queue empty 34006 1726882665.20278: checking for any_errors_fatal 34006 1726882665.20285: done checking for any_errors_fatal 34006 1726882665.20286: 
checking for max_fail_percentage 34006 1726882665.20288: done checking for max_fail_percentage 34006 1726882665.20289: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.20289: done checking to see if all hosts have failed 34006 1726882665.20290: getting the remaining hosts for this loop 34006 1726882665.20292: done getting the remaining hosts for this loop 34006 1726882665.20298: getting the next task for host managed_node3 34006 1726882665.20306: done getting next task for host managed_node3 34006 1726882665.20310: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34006 1726882665.20316: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.20335: getting variables 34006 1726882665.20337: in VariableManager get_vars() 34006 1726882665.20585: Calling all_inventory to load vars for managed_node3 34006 1726882665.20588: Calling groups_inventory to load vars for managed_node3 34006 1726882665.20591: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.20606: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.20609: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.20613: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.20784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.21008: done with get_vars() 34006 1726882665.21019: done getting variables 34006 1726882665.21081: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:37:45 -0400 (0:00:00.032) 0:00:04.987 ****** 34006 1726882665.21114: entering _queue_task() for managed_node3/fail 34006 1726882665.21368: worker is 1 (out of 1 available) 34006 1726882665.21379: exiting _queue_task() for managed_node3/fail 34006 1726882665.21392: done queuing things up, now waiting for results queue to drain 34006 1726882665.21395: waiting for pending results... 
34006 1726882665.21981: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34006 1726882665.22017: in run() - task 12673a56-9f93-11ce-7734-0000000000ff 34006 1726882665.22039: variable 'ansible_search_path' from source: unknown 34006 1726882665.22048: variable 'ansible_search_path' from source: unknown 34006 1726882665.22096: calling self._execute() 34006 1726882665.22179: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.22203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.22219: variable 'omit' from source: magic vars 34006 1726882665.22623: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.22734: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.22767: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.22779: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.22790: when evaluation is False, skipping this task 34006 1726882665.22806: _execute() done 34006 1726882665.22815: dumping result to json 34006 1726882665.22823: done dumping result, returning 34006 1726882665.22841: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-11ce-7734-0000000000ff] 34006 1726882665.22844: sending task result for task 12673a56-9f93-11ce-7734-0000000000ff skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.23060: no more pending results, returning what we have 34006 1726882665.23064: results queue empty 34006 1726882665.23065: 
checking for any_errors_fatal 34006 1726882665.23071: done checking for any_errors_fatal 34006 1726882665.23072: checking for max_fail_percentage 34006 1726882665.23073: done checking for max_fail_percentage 34006 1726882665.23074: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.23075: done checking to see if all hosts have failed 34006 1726882665.23076: getting the remaining hosts for this loop 34006 1726882665.23077: done getting the remaining hosts for this loop 34006 1726882665.23080: getting the next task for host managed_node3 34006 1726882665.23090: done getting next task for host managed_node3 34006 1726882665.23094: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34006 1726882665.23099: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.23118: getting variables 34006 1726882665.23120: in VariableManager get_vars() 34006 1726882665.23163: Calling all_inventory to load vars for managed_node3 34006 1726882665.23166: Calling groups_inventory to load vars for managed_node3 34006 1726882665.23168: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.23178: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.23181: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.23183: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.23564: done sending task result for task 12673a56-9f93-11ce-7734-0000000000ff 34006 1726882665.23566: WORKER PROCESS EXITING 34006 1726882665.23586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.23781: done with get_vars() 34006 1726882665.23796: done getting variables 34006 1726882665.23858: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:37:45 -0400 (0:00:00.027) 0:00:05.015 ****** 34006 1726882665.23894: entering _queue_task() for managed_node3/fail 34006 1726882665.24134: worker is 1 (out of 1 available) 34006 1726882665.24146: exiting _queue_task() for managed_node3/fail 34006 1726882665.24158: done queuing things up, now waiting for results queue to drain 34006 1726882665.24160: waiting for pending results... 
34006 1726882665.24484: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34006 1726882665.24677: in run() - task 12673a56-9f93-11ce-7734-000000000100
34006 1726882665.24716: variable 'ansible_search_path' from source: unknown
34006 1726882665.24726: variable 'ansible_search_path' from source: unknown
34006 1726882665.24764: calling self._execute()
34006 1726882665.24934: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.24937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.24940: variable 'omit' from source: magic vars
34006 1726882665.25285: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.25307: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.25434: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.25446: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.25455: when evaluation is False, skipping this task
34006 1726882665.25464: _execute() done
34006 1726882665.25494: dumping result to json
34006 1726882665.25591: done dumping result, returning
34006 1726882665.25597: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-11ce-7734-000000000100]
34006 1726882665.25600: sending task result for task 12673a56-9f93-11ce-7734-000000000100
34006 1726882665.25665: done sending task result for task 12673a56-9f93-11ce-7734-000000000100
34006 1726882665.25668: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.25726: no more pending results, returning what we have
34006 1726882665.25731: results queue empty
34006 1726882665.25732: checking for any_errors_fatal
34006 1726882665.25739: done checking for any_errors_fatal
34006 1726882665.25740: checking for max_fail_percentage
34006 1726882665.25742: done checking for max_fail_percentage
34006 1726882665.25743: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.25744: done checking to see if all hosts have failed
34006 1726882665.25745: getting the remaining hosts for this loop
34006 1726882665.25746: done getting the remaining hosts for this loop
34006 1726882665.25750: getting the next task for host managed_node3
34006 1726882665.25759: done getting next task for host managed_node3
34006 1726882665.25763: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882665.25768: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.25790: getting variables
34006 1726882665.25792: in VariableManager get_vars()
34006 1726882665.25840: Calling all_inventory to load vars for managed_node3
34006 1726882665.25843: Calling groups_inventory to load vars for managed_node3
34006 1726882665.25846: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.25857: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.25859: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.25862: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.26285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.26519: done with get_vars()
34006 1726882665.26528: done getting variables
34006 1726882665.26912: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:37:45 -0400 (0:00:00.030) 0:00:05.045 ******
34006 1726882665.26941: entering _queue_task() for managed_node3/fail
34006 1726882665.27400: worker is 1 (out of 1 available)
34006 1726882665.27407: exiting _queue_task() for managed_node3/fail
34006 1726882665.27416: done queuing things up, now waiting for results queue to drain
34006 1726882665.27418: waiting for pending results...
34006 1726882665.27610: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34006 1726882665.27615: in run() - task 12673a56-9f93-11ce-7734-000000000101
34006 1726882665.27632: variable 'ansible_search_path' from source: unknown
34006 1726882665.27644: variable 'ansible_search_path' from source: unknown
34006 1726882665.27680: calling self._execute()
34006 1726882665.27766: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.27777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.27791: variable 'omit' from source: magic vars
34006 1726882665.28139: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.28186: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.28272: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.28283: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.28299: when evaluation is False, skipping this task
34006 1726882665.28307: _execute() done
34006 1726882665.28400: dumping result to json
34006 1726882665.28404: done dumping result, returning
34006 1726882665.28407: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-11ce-7734-000000000101]
34006 1726882665.28410: sending task result for task 12673a56-9f93-11ce-7734-000000000101
34006 1726882665.28474: done sending task result for task 12673a56-9f93-11ce-7734-000000000101
34006 1726882665.28478: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.28545: no more pending results, returning what we have
34006 1726882665.28548: results queue empty
34006 1726882665.28549: checking for any_errors_fatal
34006 1726882665.28554: done checking for any_errors_fatal
34006 1726882665.28554: checking for max_fail_percentage
34006 1726882665.28556: done checking for max_fail_percentage
34006 1726882665.28557: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.28558: done checking to see if all hosts have failed
34006 1726882665.28559: getting the remaining hosts for this loop
34006 1726882665.28560: done getting the remaining hosts for this loop
34006 1726882665.28563: getting the next task for host managed_node3
34006 1726882665.28571: done getting next task for host managed_node3
34006 1726882665.28574: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34006 1726882665.28578: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.28599: getting variables
34006 1726882665.28601: in VariableManager get_vars()
34006 1726882665.28644: Calling all_inventory to load vars for managed_node3
34006 1726882665.28646: Calling groups_inventory to load vars for managed_node3
34006 1726882665.28649: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.28660: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.28662: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.28665: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.28975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.29206: done with get_vars()
34006 1726882665.29216: done getting variables
34006 1726882665.29266: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:37:45 -0400 (0:00:00.023) 0:00:05.069 ******
34006 1726882665.29295: entering _queue_task() for managed_node3/dnf
34006 1726882665.29502: worker is 1 (out of 1 available)
34006 1726882665.29513: exiting _queue_task() for managed_node3/dnf
34006 1726882665.29523: done queuing things up, now waiting for results queue to drain
34006 1726882665.29525: waiting for pending results...
34006 1726882665.29910: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34006 1726882665.29915: in run() - task 12673a56-9f93-11ce-7734-000000000102
34006 1726882665.29917: variable 'ansible_search_path' from source: unknown
34006 1726882665.29920: variable 'ansible_search_path' from source: unknown
34006 1726882665.29949: calling self._execute()
34006 1726882665.30035: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.30046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.30060: variable 'omit' from source: magic vars
34006 1726882665.30421: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.30439: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.30559: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.30568: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.30574: when evaluation is False, skipping this task
34006 1726882665.30581: _execute() done
34006 1726882665.30587: dumping result to json
34006 1726882665.30596: done dumping result, returning
34006 1726882665.30606: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000102]
34006 1726882665.30613: sending task result for task 12673a56-9f93-11ce-7734-000000000102
34006 1726882665.30858: done sending task result for task 12673a56-9f93-11ce-7734-000000000102
34006 1726882665.30861: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.30916: no more pending results, returning what we have
34006 1726882665.30920: results queue empty
34006 1726882665.30921: checking for any_errors_fatal
34006 1726882665.30929: done checking for any_errors_fatal
34006 1726882665.30929: checking for max_fail_percentage
34006 1726882665.30931: done checking for max_fail_percentage
34006 1726882665.30931: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.30932: done checking to see if all hosts have failed
34006 1726882665.30933: getting the remaining hosts for this loop
34006 1726882665.30935: done getting the remaining hosts for this loop
34006 1726882665.30939: getting the next task for host managed_node3
34006 1726882665.30946: done getting next task for host managed_node3
34006 1726882665.30950: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34006 1726882665.30954: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.30974: getting variables
34006 1726882665.30976: in VariableManager get_vars()
34006 1726882665.31030: Calling all_inventory to load vars for managed_node3
34006 1726882665.31033: Calling groups_inventory to load vars for managed_node3
34006 1726882665.31035: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.31046: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.31049: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.31052: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.31334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.31536: done with get_vars()
34006 1726882665.31547: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34006 1726882665.31624: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 21:37:45 -0400 (0:00:00.023) 0:00:05.092 ******
34006 1726882665.31655: entering _queue_task() for managed_node3/yum
34006 1726882665.31898: worker is 1 (out of 1 available)
34006 1726882665.31909: exiting _queue_task() for managed_node3/yum
34006 1726882665.31920: done queuing things up, now waiting for results queue to drain
34006 1726882665.31921: waiting for pending results...
34006 1726882665.32421: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34006 1726882665.32679: in run() - task 12673a56-9f93-11ce-7734-000000000103
34006 1726882665.32788: variable 'ansible_search_path' from source: unknown
34006 1726882665.32792: variable 'ansible_search_path' from source: unknown
34006 1726882665.32797: calling self._execute()
34006 1726882665.32923: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.32935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.32949: variable 'omit' from source: magic vars
34006 1726882665.33598: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.33901: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.33936: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.33948: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.33956: when evaluation is False, skipping this task
34006 1726882665.33964: _execute() done
34006 1726882665.33970: dumping result to json
34006 1726882665.34015: done dumping result, returning
34006 1726882665.34028: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000103]
34006 1726882665.34038: sending task result for task 12673a56-9f93-11ce-7734-000000000103
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.34186: no more pending results, returning what we have
34006 1726882665.34190: results queue empty
34006 1726882665.34190: checking for any_errors_fatal
34006 1726882665.34196: done checking for any_errors_fatal
34006 1726882665.34197: checking for max_fail_percentage
34006 1726882665.34199: done checking for max_fail_percentage
34006 1726882665.34200: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.34201: done checking to see if all hosts have failed
34006 1726882665.34201: getting the remaining hosts for this loop
34006 1726882665.34203: done getting the remaining hosts for this loop
34006 1726882665.34206: getting the next task for host managed_node3
34006 1726882665.34214: done getting next task for host managed_node3
34006 1726882665.34218: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34006 1726882665.34223: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.34241: getting variables
34006 1726882665.34243: in VariableManager get_vars()
34006 1726882665.34290: Calling all_inventory to load vars for managed_node3
34006 1726882665.34595: Calling groups_inventory to load vars for managed_node3
34006 1726882665.34600: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.34610: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.34613: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.34616: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.35133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.35524: done with get_vars()
34006 1726882665.35534: done getting variables
34006 1726882665.35561: done sending task result for task 12673a56-9f93-11ce-7734-000000000103
34006 1726882665.35564: WORKER PROCESS EXITING
34006 1726882665.35601: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 21:37:45 -0400 (0:00:00.039) 0:00:05.132 ******
34006 1726882665.35634: entering _queue_task() for managed_node3/fail
34006 1726882665.36077: worker is 1 (out of 1 available)
34006 1726882665.36089: exiting _queue_task() for managed_node3/fail
34006 1726882665.36278: done queuing things up, now waiting for results queue to drain
34006 1726882665.36281: waiting for pending results...
34006 1726882665.36374: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34006 1726882665.36527: in run() - task 12673a56-9f93-11ce-7734-000000000104
34006 1726882665.36553: variable 'ansible_search_path' from source: unknown
34006 1726882665.36562: variable 'ansible_search_path' from source: unknown
34006 1726882665.36605: calling self._execute()
34006 1726882665.36725: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.36755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.36772: variable 'omit' from source: magic vars
34006 1726882665.37240: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.37302: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.37445: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.37482: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.37485: when evaluation is False, skipping this task
34006 1726882665.37490: _execute() done
34006 1726882665.37495: dumping result to json
34006 1726882665.37498: done dumping result, returning
34006 1726882665.37590: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000104]
34006 1726882665.37600: sending task result for task 12673a56-9f93-11ce-7734-000000000104
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.37784: no more pending results, returning what we have
34006 1726882665.37791: results queue empty
34006 1726882665.37792: checking for any_errors_fatal
34006 1726882665.37800: done checking for any_errors_fatal
34006 1726882665.37801: checking for max_fail_percentage
34006 1726882665.37803: done checking for max_fail_percentage
34006 1726882665.37804: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.37804: done checking to see if all hosts have failed
34006 1726882665.37805: getting the remaining hosts for this loop
34006 1726882665.37807: done getting the remaining hosts for this loop
34006 1726882665.37810: getting the next task for host managed_node3
34006 1726882665.37820: done getting next task for host managed_node3
34006 1726882665.37824: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
34006 1726882665.37831: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.37851: getting variables
34006 1726882665.37853: in VariableManager get_vars()
34006 1726882665.38021: Calling all_inventory to load vars for managed_node3
34006 1726882665.38024: Calling groups_inventory to load vars for managed_node3
34006 1726882665.38026: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.38036: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.38039: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.38042: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.38346: done sending task result for task 12673a56-9f93-11ce-7734-000000000104
34006 1726882665.38349: WORKER PROCESS EXITING
34006 1726882665.38372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.38609: done with get_vars()
34006 1726882665.38626: done getting variables
34006 1726882665.38702: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:37:45 -0400 (0:00:00.031) 0:00:05.163 ******
34006 1726882665.38742: entering _queue_task() for managed_node3/package
34006 1726882665.39109: worker is 1 (out of 1 available)
34006 1726882665.39119: exiting _queue_task() for managed_node3/package
34006 1726882665.39129: done queuing things up, now waiting for results queue to drain
34006 1726882665.39131: waiting for pending results...
34006 1726882665.39413: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
34006 1726882665.39479: in run() - task 12673a56-9f93-11ce-7734-000000000105
34006 1726882665.39505: variable 'ansible_search_path' from source: unknown
34006 1726882665.39519: variable 'ansible_search_path' from source: unknown
34006 1726882665.39560: calling self._execute()
34006 1726882665.39657: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.39670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.39734: variable 'omit' from source: magic vars
34006 1726882665.40113: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.40129: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.40284: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.40303: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.40311: when evaluation is False, skipping this task
34006 1726882665.40318: _execute() done
34006 1726882665.40326: dumping result to json
34006 1726882665.40390: done dumping result, returning
34006 1726882665.40402: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-11ce-7734-000000000105]
34006 1726882665.40405: sending task result for task 12673a56-9f93-11ce-7734-000000000105
34006 1726882665.40476: done sending task result for task 12673a56-9f93-11ce-7734-000000000105
34006 1726882665.40479: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.40532: no more pending results, returning what we have
34006 1726882665.40536: results queue empty
34006 1726882665.40537: checking for any_errors_fatal
34006 1726882665.40549: done checking for any_errors_fatal
34006 1726882665.40550: checking for max_fail_percentage
34006 1726882665.40551: done checking for max_fail_percentage
34006 1726882665.40552: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.40553: done checking to see if all hosts have failed
34006 1726882665.40553: getting the remaining hosts for this loop
34006 1726882665.40555: done getting the remaining hosts for this loop
34006 1726882665.40558: getting the next task for host managed_node3
34006 1726882665.40566: done getting next task for host managed_node3
34006 1726882665.40569: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882665.40573: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.40596: getting variables
34006 1726882665.40598: in VariableManager get_vars()
34006 1726882665.40641: Calling all_inventory to load vars for managed_node3
34006 1726882665.40644: Calling groups_inventory to load vars for managed_node3
34006 1726882665.40647: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.40658: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.40661: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.40664: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.41236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.41692: done with get_vars()
34006 1726882665.41703: done getting variables
34006 1726882665.41760: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:37:45 -0400 (0:00:00.030) 0:00:05.194 ******
34006 1726882665.41796: entering _queue_task() for managed_node3/package
34006 1726882665.42075: worker is 1 (out of 1 available)
34006 1726882665.42090: exiting _queue_task() for managed_node3/package
34006 1726882665.42105: done queuing things up, now waiting for results queue to drain
34006 1726882665.42107: waiting for pending results...
34006 1726882665.42606: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34006 1726882665.42611: in run() - task 12673a56-9f93-11ce-7734-000000000106
34006 1726882665.42614: variable 'ansible_search_path' from source: unknown
34006 1726882665.42616: variable 'ansible_search_path' from source: unknown
34006 1726882665.42619: calling self._execute()
34006 1726882665.42697: variable 'ansible_host' from source: host vars for 'managed_node3'
34006 1726882665.42713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
34006 1726882665.42727: variable 'omit' from source: magic vars
34006 1726882665.43098: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.43115: Evaluated conditional (ansible_distribution_major_version != '6'): True
34006 1726882665.43241: variable 'ansible_distribution_major_version' from source: facts
34006 1726882665.43254: Evaluated conditional (ansible_distribution_major_version == '7'): False
34006 1726882665.43261: when evaluation is False, skipping this task
34006 1726882665.43268: _execute() done
34006 1726882665.43297: dumping result to json
34006 1726882665.43300: done dumping result, returning
34006 1726882665.43303: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-11ce-7734-000000000106]
34006 1726882665.43306: sending task result for task 12673a56-9f93-11ce-7734-000000000106
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34006 1726882665.43564: no more pending results, returning what we have
34006 1726882665.43567: results queue empty
34006 1726882665.43568: checking for any_errors_fatal
34006 1726882665.43576: done checking for any_errors_fatal
34006 1726882665.43576: checking for max_fail_percentage
34006 1726882665.43578: done checking for max_fail_percentage
34006 1726882665.43578: checking to see if all hosts have failed and the running result is not ok
34006 1726882665.43579: done checking to see if all hosts have failed
34006 1726882665.43580: getting the remaining hosts for this loop
34006 1726882665.43581: done getting the remaining hosts for this loop
34006 1726882665.43585: getting the next task for host managed_node3
34006 1726882665.43597: done getting next task for host managed_node3
34006 1726882665.43601: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34006 1726882665.43606: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
34006 1726882665.43627: getting variables
34006 1726882665.43629: in VariableManager get_vars()
34006 1726882665.43673: Calling all_inventory to load vars for managed_node3
34006 1726882665.43675: Calling groups_inventory to load vars for managed_node3
34006 1726882665.43678: Calling all_plugins_inventory to load vars for managed_node3
34006 1726882665.43691: Calling all_plugins_play to load vars for managed_node3
34006 1726882665.43797: Calling groups_plugins_inventory to load vars for managed_node3
34006 1726882665.43806: Calling groups_plugins_play to load vars for managed_node3
34006 1726882665.43994: done sending task result for task 12673a56-9f93-11ce-7734-000000000106
34006 1726882665.43998: WORKER PROCESS EXITING
34006 1726882665.44025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34006 1726882665.44246: done with get_vars()
34006 1726882665.44257: done getting variables
34006 1726882665.44315: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:05.219 ******
34006 1726882665.44351: entering _queue_task() for managed_node3/package
34006 1726882665.44700: worker is 1 (out of 1 available)
34006 1726882665.44709: exiting _queue_task() for managed_node3/package
34006 1726882665.44719: done queuing things up, now waiting for results queue to drain
34006 1726882665.44720: waiting for pending results...
34006 1726882665.44876: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34006 1726882665.45029: in run() - task 12673a56-9f93-11ce-7734-000000000107 34006 1726882665.45055: variable 'ansible_search_path' from source: unknown 34006 1726882665.45064: variable 'ansible_search_path' from source: unknown 34006 1726882665.45112: calling self._execute() 34006 1726882665.45216: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.45230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.45245: variable 'omit' from source: magic vars 34006 1726882665.45638: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.45660: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.45786: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.45804: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.45816: when evaluation is False, skipping this task 34006 1726882665.45825: _execute() done 34006 1726882665.45833: dumping result to json 34006 1726882665.45841: done dumping result, returning 34006 1726882665.45851: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-11ce-7734-000000000107] 34006 1726882665.45861: sending task result for task 12673a56-9f93-11ce-7734-000000000107 34006 1726882665.46051: done sending task result for task 12673a56-9f93-11ce-7734-000000000107 34006 1726882665.46054: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.46212: no more pending results, returning what we have 34006 1726882665.46215: results queue 
empty 34006 1726882665.46216: checking for any_errors_fatal 34006 1726882665.46222: done checking for any_errors_fatal 34006 1726882665.46223: checking for max_fail_percentage 34006 1726882665.46224: done checking for max_fail_percentage 34006 1726882665.46225: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.46226: done checking to see if all hosts have failed 34006 1726882665.46227: getting the remaining hosts for this loop 34006 1726882665.46228: done getting the remaining hosts for this loop 34006 1726882665.46231: getting the next task for host managed_node3 34006 1726882665.46238: done getting next task for host managed_node3 34006 1726882665.46241: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34006 1726882665.46245: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.46261: getting variables 34006 1726882665.46263: in VariableManager get_vars() 34006 1726882665.46351: Calling all_inventory to load vars for managed_node3 34006 1726882665.46354: Calling groups_inventory to load vars for managed_node3 34006 1726882665.46356: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.46364: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.46366: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.46370: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.46577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.46781: done with get_vars() 34006 1726882665.46794: done getting variables 34006 1726882665.46848: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:05.245 ****** 34006 1726882665.46877: entering _queue_task() for managed_node3/service 34006 1726882665.47116: worker is 1 (out of 1 available) 34006 1726882665.47129: exiting _queue_task() for managed_node3/service 34006 1726882665.47140: done queuing things up, now waiting for results queue to drain 34006 1726882665.47142: waiting for pending results... 
34006 1726882665.47520: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34006 1726882665.47562: in run() - task 12673a56-9f93-11ce-7734-000000000108 34006 1726882665.47584: variable 'ansible_search_path' from source: unknown 34006 1726882665.47597: variable 'ansible_search_path' from source: unknown 34006 1726882665.47641: calling self._execute() 34006 1726882665.47734: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.47745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.47758: variable 'omit' from source: magic vars 34006 1726882665.48122: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.48138: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.48256: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.48379: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.48382: when evaluation is False, skipping this task 34006 1726882665.48385: _execute() done 34006 1726882665.48389: dumping result to json 34006 1726882665.48391: done dumping result, returning 34006 1726882665.48395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-11ce-7734-000000000108] 34006 1726882665.48398: sending task result for task 12673a56-9f93-11ce-7734-000000000108 34006 1726882665.48465: done sending task result for task 12673a56-9f93-11ce-7734-000000000108 34006 1726882665.48468: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.48526: no more pending results, returning what we have 34006 1726882665.48530: results queue empty 
34006 1726882665.48531: checking for any_errors_fatal 34006 1726882665.48539: done checking for any_errors_fatal 34006 1726882665.48540: checking for max_fail_percentage 34006 1726882665.48541: done checking for max_fail_percentage 34006 1726882665.48542: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.48543: done checking to see if all hosts have failed 34006 1726882665.48544: getting the remaining hosts for this loop 34006 1726882665.48545: done getting the remaining hosts for this loop 34006 1726882665.48549: getting the next task for host managed_node3 34006 1726882665.48557: done getting next task for host managed_node3 34006 1726882665.48561: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34006 1726882665.48565: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.48584: getting variables 34006 1726882665.48586: in VariableManager get_vars() 34006 1726882665.48634: Calling all_inventory to load vars for managed_node3 34006 1726882665.48637: Calling groups_inventory to load vars for managed_node3 34006 1726882665.48639: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.48650: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.48653: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.48656: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.48942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.49201: done with get_vars() 34006 1726882665.49211: done getting variables 34006 1726882665.49269: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:37:45 -0400 (0:00:00.024) 0:00:05.269 ****** 34006 1726882665.49302: entering _queue_task() for managed_node3/service 34006 1726882665.49530: worker is 1 (out of 1 available) 34006 1726882665.49543: exiting _queue_task() for managed_node3/service 34006 1726882665.49555: done queuing things up, now waiting for results queue to drain 34006 1726882665.49557: waiting for pending results... 
34006 1726882665.49824: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34006 1726882665.49999: in run() - task 12673a56-9f93-11ce-7734-000000000109 34006 1726882665.50003: variable 'ansible_search_path' from source: unknown 34006 1726882665.50006: variable 'ansible_search_path' from source: unknown 34006 1726882665.50031: calling self._execute() 34006 1726882665.50114: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.50131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.50143: variable 'omit' from source: magic vars 34006 1726882665.50563: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.50567: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.50734: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.50747: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.50755: when evaluation is False, skipping this task 34006 1726882665.50762: _execute() done 34006 1726882665.50768: dumping result to json 34006 1726882665.50783: done dumping result, returning 34006 1726882665.50799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-11ce-7734-000000000109] 34006 1726882665.50809: sending task result for task 12673a56-9f93-11ce-7734-000000000109 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882665.51059: no more pending results, returning what we have 34006 1726882665.51062: results queue empty 34006 1726882665.51063: checking for any_errors_fatal 34006 1726882665.51069: done checking for any_errors_fatal 34006 1726882665.51070: checking for max_fail_percentage 34006 1726882665.51071: done 
checking for max_fail_percentage 34006 1726882665.51072: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.51073: done checking to see if all hosts have failed 34006 1726882665.51074: getting the remaining hosts for this loop 34006 1726882665.51076: done getting the remaining hosts for this loop 34006 1726882665.51079: getting the next task for host managed_node3 34006 1726882665.51091: done getting next task for host managed_node3 34006 1726882665.51096: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34006 1726882665.51102: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.51124: getting variables 34006 1726882665.51127: in VariableManager get_vars() 34006 1726882665.51174: Calling all_inventory to load vars for managed_node3 34006 1726882665.51177: Calling groups_inventory to load vars for managed_node3 34006 1726882665.51180: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.51202: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.51205: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.51310: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.51549: done sending task result for task 12673a56-9f93-11ce-7734-000000000109 34006 1726882665.51552: WORKER PROCESS EXITING 34006 1726882665.51574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.51783: done with get_vars() 34006 1726882665.51796: done getting variables 34006 1726882665.51856: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:05.295 ****** 34006 1726882665.51894: entering _queue_task() for managed_node3/service 34006 1726882665.52133: worker is 1 (out of 1 available) 34006 1726882665.52146: exiting _queue_task() for managed_node3/service 34006 1726882665.52157: done queuing things up, now waiting for results queue to drain 34006 1726882665.52159: waiting for pending results... 
34006 1726882665.52426: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34006 1726882665.52558: in run() - task 12673a56-9f93-11ce-7734-00000000010a 34006 1726882665.52577: variable 'ansible_search_path' from source: unknown 34006 1726882665.52584: variable 'ansible_search_path' from source: unknown 34006 1726882665.52633: calling self._execute() 34006 1726882665.52725: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.52737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.52751: variable 'omit' from source: magic vars 34006 1726882665.53115: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.53131: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.53253: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.53270: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.53277: when evaluation is False, skipping this task 34006 1726882665.53284: _execute() done 34006 1726882665.53295: dumping result to json 34006 1726882665.53303: done dumping result, returning 34006 1726882665.53315: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-11ce-7734-00000000010a] 34006 1726882665.53324: sending task result for task 12673a56-9f93-11ce-7734-00000000010a skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.53530: no more pending results, returning what we have 34006 1726882665.53534: results queue empty 34006 1726882665.53535: checking for any_errors_fatal 34006 1726882665.53544: done checking for any_errors_fatal 34006 1726882665.53545: checking for max_fail_percentage 34006 1726882665.53547: 
done checking for max_fail_percentage 34006 1726882665.53548: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.53548: done checking to see if all hosts have failed 34006 1726882665.53549: getting the remaining hosts for this loop 34006 1726882665.53551: done getting the remaining hosts for this loop 34006 1726882665.53554: getting the next task for host managed_node3 34006 1726882665.53563: done getting next task for host managed_node3 34006 1726882665.53566: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882665.53571: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.53754: getting variables 34006 1726882665.53756: in VariableManager get_vars() 34006 1726882665.53799: Calling all_inventory to load vars for managed_node3 34006 1726882665.53801: Calling groups_inventory to load vars for managed_node3 34006 1726882665.53804: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.53810: done sending task result for task 12673a56-9f93-11ce-7734-00000000010a 34006 1726882665.53812: WORKER PROCESS EXITING 34006 1726882665.53820: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.53823: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.53826: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.54005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.54227: done with get_vars() 34006 1726882665.54237: done getting variables 34006 1726882665.54298: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:37:45 -0400 (0:00:00.024) 0:00:05.319 ****** 34006 1726882665.54327: entering _queue_task() for managed_node3/service 34006 1726882665.54550: worker is 1 (out of 1 available) 34006 1726882665.54561: exiting _queue_task() for managed_node3/service 34006 1726882665.54573: done queuing things up, now waiting for results queue to drain 34006 1726882665.54574: waiting for pending results... 
34006 1726882665.54969: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 34006 1726882665.55144: in run() - task 12673a56-9f93-11ce-7734-00000000010b 34006 1726882665.55170: variable 'ansible_search_path' from source: unknown 34006 1726882665.55198: variable 'ansible_search_path' from source: unknown 34006 1726882665.55228: calling self._execute() 34006 1726882665.55390: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.55395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.55397: variable 'omit' from source: magic vars 34006 1726882665.55774: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.55798: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.55932: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.55948: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.55955: when evaluation is False, skipping this task 34006 1726882665.55962: _execute() done 34006 1726882665.55969: dumping result to json 34006 1726882665.56000: done dumping result, returning 34006 1726882665.56004: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-11ce-7734-00000000010b] 34006 1726882665.56006: sending task result for task 12673a56-9f93-11ce-7734-00000000010b 34006 1726882665.56121: done sending task result for task 12673a56-9f93-11ce-7734-00000000010b 34006 1726882665.56124: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34006 1726882665.56223: no more pending results, returning what we have 34006 1726882665.56226: results queue empty 34006 1726882665.56227: checking for any_errors_fatal 34006 
1726882665.56234: done checking for any_errors_fatal 34006 1726882665.56235: checking for max_fail_percentage 34006 1726882665.56237: done checking for max_fail_percentage 34006 1726882665.56237: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.56238: done checking to see if all hosts have failed 34006 1726882665.56239: getting the remaining hosts for this loop 34006 1726882665.56241: done getting the remaining hosts for this loop 34006 1726882665.56245: getting the next task for host managed_node3 34006 1726882665.56255: done getting next task for host managed_node3 34006 1726882665.56259: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882665.56265: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.56286: getting variables 34006 1726882665.56291: in VariableManager get_vars() 34006 1726882665.56352: Calling all_inventory to load vars for managed_node3 34006 1726882665.56355: Calling groups_inventory to load vars for managed_node3 34006 1726882665.56358: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.56370: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.56373: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.56376: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.57217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.57563: done with get_vars() 34006 1726882665.57573: done getting variables 34006 1726882665.57771: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:37:45 -0400 (0:00:00.035) 0:00:05.354 ****** 34006 1726882665.57861: entering _queue_task() for managed_node3/copy 34006 1726882665.58614: worker is 1 (out of 1 available) 34006 1726882665.58624: exiting _queue_task() for managed_node3/copy 34006 1726882665.58634: done queuing things up, now waiting for results queue to drain 34006 1726882665.58635: waiting for pending results... 
34006 1726882665.58920: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34006 1726882665.59010: in run() - task 12673a56-9f93-11ce-7734-00000000010c 34006 1726882665.59016: variable 'ansible_search_path' from source: unknown 34006 1726882665.59019: variable 'ansible_search_path' from source: unknown 34006 1726882665.59021: calling self._execute() 34006 1726882665.59116: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.59136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.59202: variable 'omit' from source: magic vars 34006 1726882665.59533: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.59555: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.59686: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.59703: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.59710: when evaluation is False, skipping this task 34006 1726882665.59717: _execute() done 34006 1726882665.59725: dumping result to json 34006 1726882665.59732: done dumping result, returning 34006 1726882665.59744: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-11ce-7734-00000000010c] 34006 1726882665.59786: sending task result for task 12673a56-9f93-11ce-7734-00000000010c skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.60060: no more pending results, returning what we have 34006 1726882665.60063: results queue empty 34006 1726882665.60065: checking for any_errors_fatal 34006 1726882665.60070: done checking for any_errors_fatal 34006 1726882665.60071: checking for 
max_fail_percentage 34006 1726882665.60073: done checking for max_fail_percentage 34006 1726882665.60074: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.60075: done checking to see if all hosts have failed 34006 1726882665.60075: getting the remaining hosts for this loop 34006 1726882665.60077: done getting the remaining hosts for this loop 34006 1726882665.60081: getting the next task for host managed_node3 34006 1726882665.60109: done getting next task for host managed_node3 34006 1726882665.60114: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882665.60118: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.60131: done sending task result for task 12673a56-9f93-11ce-7734-00000000010c 34006 1726882665.60134: WORKER PROCESS EXITING 34006 1726882665.60147: getting variables 34006 1726882665.60149: in VariableManager get_vars() 34006 1726882665.60313: Calling all_inventory to load vars for managed_node3 34006 1726882665.60316: Calling groups_inventory to load vars for managed_node3 34006 1726882665.60319: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.60329: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.60332: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.60335: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.60602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.60847: done with get_vars() 34006 1726882665.60902: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:37:45 -0400 (0:00:00.031) 0:00:05.386 ****** 34006 1726882665.60981: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882665.61318: worker is 1 (out of 1 available) 34006 1726882665.61335: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 34006 1726882665.61347: done queuing things up, now waiting for results queue to drain 34006 1726882665.61348: waiting for pending results... 
34006 1726882665.61565: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34006 1726882665.61695: in run() - task 12673a56-9f93-11ce-7734-00000000010d 34006 1726882665.61717: variable 'ansible_search_path' from source: unknown 34006 1726882665.61724: variable 'ansible_search_path' from source: unknown 34006 1726882665.61761: calling self._execute() 34006 1726882665.61853: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.61865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.61890: variable 'omit' from source: magic vars 34006 1726882665.62327: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.62345: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.62498: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.62501: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.62503: when evaluation is False, skipping this task 34006 1726882665.62506: _execute() done 34006 1726882665.62508: dumping result to json 34006 1726882665.62510: done dumping result, returning 34006 1726882665.62512: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-11ce-7734-00000000010d] 34006 1726882665.62514: sending task result for task 12673a56-9f93-11ce-7734-00000000010d 34006 1726882665.62799: done sending task result for task 12673a56-9f93-11ce-7734-00000000010d 34006 1726882665.62802: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.62840: no more pending results, returning what we have 34006 1726882665.62843: results queue empty 34006 1726882665.62844: checking 
for any_errors_fatal 34006 1726882665.62848: done checking for any_errors_fatal 34006 1726882665.62849: checking for max_fail_percentage 34006 1726882665.62850: done checking for max_fail_percentage 34006 1726882665.62851: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.62852: done checking to see if all hosts have failed 34006 1726882665.62852: getting the remaining hosts for this loop 34006 1726882665.62853: done getting the remaining hosts for this loop 34006 1726882665.62856: getting the next task for host managed_node3 34006 1726882665.62861: done getting next task for host managed_node3 34006 1726882665.62864: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882665.62867: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.62882: getting variables 34006 1726882665.62883: in VariableManager get_vars() 34006 1726882665.62924: Calling all_inventory to load vars for managed_node3 34006 1726882665.62927: Calling groups_inventory to load vars for managed_node3 34006 1726882665.62929: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.62937: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.62940: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.62942: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.63187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.63413: done with get_vars() 34006 1726882665.63423: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:37:45 -0400 (0:00:00.025) 0:00:05.411 ****** 34006 1726882665.63507: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882665.63751: worker is 1 (out of 1 available) 34006 1726882665.63875: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 34006 1726882665.63885: done queuing things up, now waiting for results queue to drain 34006 1726882665.63886: waiting for pending results... 
34006 1726882665.64055: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 34006 1726882665.64218: in run() - task 12673a56-9f93-11ce-7734-00000000010e 34006 1726882665.64222: variable 'ansible_search_path' from source: unknown 34006 1726882665.64226: variable 'ansible_search_path' from source: unknown 34006 1726882665.64308: calling self._execute() 34006 1726882665.64351: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.64362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.64375: variable 'omit' from source: magic vars 34006 1726882665.64764: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.64782: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.64925: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.64935: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.64942: when evaluation is False, skipping this task 34006 1726882665.64957: _execute() done 34006 1726882665.64960: dumping result to json 34006 1726882665.64971: done dumping result, returning 34006 1726882665.65068: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-11ce-7734-00000000010e] 34006 1726882665.65071: sending task result for task 12673a56-9f93-11ce-7734-00000000010e 34006 1726882665.65156: done sending task result for task 12673a56-9f93-11ce-7734-00000000010e 34006 1726882665.65159: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.65283: no more pending results, returning what we have 34006 1726882665.65300: results queue empty 34006 1726882665.65302: checking for any_errors_fatal 34006 
1726882665.65309: done checking for any_errors_fatal 34006 1726882665.65309: checking for max_fail_percentage 34006 1726882665.65311: done checking for max_fail_percentage 34006 1726882665.65312: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.65313: done checking to see if all hosts have failed 34006 1726882665.65314: getting the remaining hosts for this loop 34006 1726882665.65315: done getting the remaining hosts for this loop 34006 1726882665.65319: getting the next task for host managed_node3 34006 1726882665.65327: done getting next task for host managed_node3 34006 1726882665.65348: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882665.65354: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.65375: getting variables 34006 1726882665.65377: in VariableManager get_vars() 34006 1726882665.65428: Calling all_inventory to load vars for managed_node3 34006 1726882665.65432: Calling groups_inventory to load vars for managed_node3 34006 1726882665.65434: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.65581: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.65584: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.65591: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.65840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.66060: done with get_vars() 34006 1726882665.66070: done getting variables 34006 1726882665.66129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:37:45 -0400 (0:00:00.028) 0:00:05.440 ****** 34006 1726882665.66404: entering _queue_task() for managed_node3/debug 34006 1726882665.66949: worker is 1 (out of 1 available) 34006 1726882665.66961: exiting _queue_task() for managed_node3/debug 34006 1726882665.66972: done queuing things up, now waiting for results queue to drain 34006 1726882665.66974: waiting for pending results... 
34006 1726882665.67830: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34006 1726882665.68141: in run() - task 12673a56-9f93-11ce-7734-00000000010f 34006 1726882665.68476: variable 'ansible_search_path' from source: unknown 34006 1726882665.68698: variable 'ansible_search_path' from source: unknown 34006 1726882665.68703: calling self._execute() 34006 1726882665.69001: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.69005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.69008: variable 'omit' from source: magic vars 34006 1726882665.69705: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.69895: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.70199: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.70202: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.70205: when evaluation is False, skipping this task 34006 1726882665.70207: _execute() done 34006 1726882665.70209: dumping result to json 34006 1726882665.70211: done dumping result, returning 34006 1726882665.70214: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-11ce-7734-00000000010f] 34006 1726882665.70216: sending task result for task 12673a56-9f93-11ce-7734-00000000010f 34006 1726882665.70290: done sending task result for task 12673a56-9f93-11ce-7734-00000000010f skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.70341: no more pending results, returning what we have 34006 1726882665.70346: results queue empty 34006 1726882665.70347: checking for any_errors_fatal 34006 1726882665.70353: done checking for any_errors_fatal 34006 
1726882665.70354: checking for max_fail_percentage 34006 1726882665.70356: done checking for max_fail_percentage 34006 1726882665.70357: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.70358: done checking to see if all hosts have failed 34006 1726882665.70358: getting the remaining hosts for this loop 34006 1726882665.70360: done getting the remaining hosts for this loop 34006 1726882665.70364: getting the next task for host managed_node3 34006 1726882665.70372: done getting next task for host managed_node3 34006 1726882665.70375: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882665.70380: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.70403: getting variables 34006 1726882665.70405: in VariableManager get_vars() 34006 1726882665.70452: Calling all_inventory to load vars for managed_node3 34006 1726882665.70454: Calling groups_inventory to load vars for managed_node3 34006 1726882665.70457: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.70467: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.70470: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.70472: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.70970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.71986: WORKER PROCESS EXITING 34006 1726882665.72000: done with get_vars() 34006 1726882665.72013: done getting variables 34006 1726882665.72074: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:37:45 -0400 (0:00:00.057) 0:00:05.497 ****** 34006 1726882665.72113: entering _queue_task() for managed_node3/debug 34006 1726882665.73228: worker is 1 (out of 1 available) 34006 1726882665.73237: exiting _queue_task() for managed_node3/debug 34006 1726882665.73247: done queuing things up, now waiting for results queue to drain 34006 1726882665.73249: waiting for pending results... 
34006 1726882665.73568: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34006 1726882665.73936: in run() - task 12673a56-9f93-11ce-7734-000000000110 34006 1726882665.74030: variable 'ansible_search_path' from source: unknown 34006 1726882665.74124: variable 'ansible_search_path' from source: unknown 34006 1726882665.74168: calling self._execute() 34006 1726882665.74418: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.74454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.74564: variable 'omit' from source: magic vars 34006 1726882665.75274: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.75299: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.75615: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.75627: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.75635: when evaluation is False, skipping this task 34006 1726882665.75641: _execute() done 34006 1726882665.75648: dumping result to json 34006 1726882665.75655: done dumping result, returning 34006 1726882665.75666: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-11ce-7734-000000000110] 34006 1726882665.75674: sending task result for task 12673a56-9f93-11ce-7734-000000000110 skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.75862: no more pending results, returning what we have 34006 1726882665.75866: results queue empty 34006 1726882665.75867: checking for any_errors_fatal 34006 1726882665.75874: done checking for any_errors_fatal 34006 1726882665.75875: checking for max_fail_percentage 34006 1726882665.75877: done checking for 
max_fail_percentage 34006 1726882665.75878: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.75878: done checking to see if all hosts have failed 34006 1726882665.75879: getting the remaining hosts for this loop 34006 1726882665.75881: done getting the remaining hosts for this loop 34006 1726882665.75885: getting the next task for host managed_node3 34006 1726882665.75898: done getting next task for host managed_node3 34006 1726882665.75902: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882665.75908: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.75928: getting variables 34006 1726882665.75930: in VariableManager get_vars() 34006 1726882665.75978: Calling all_inventory to load vars for managed_node3 34006 1726882665.75982: Calling groups_inventory to load vars for managed_node3 34006 1726882665.75984: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.76102: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.76106: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.76110: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.76701: done sending task result for task 12673a56-9f93-11ce-7734-000000000110 34006 1726882665.76705: WORKER PROCESS EXITING 34006 1726882665.76727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.77137: done with get_vars() 34006 1726882665.77147: done getting variables 34006 1726882665.77211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:37:45 -0400 (0:00:00.051) 0:00:05.548 ****** 34006 1726882665.77243: entering _queue_task() for managed_node3/debug 34006 1726882665.78030: worker is 1 (out of 1 available) 34006 1726882665.78042: exiting _queue_task() for managed_node3/debug 34006 1726882665.78053: done queuing things up, now waiting for results queue to drain 34006 1726882665.78055: waiting for pending results... 
34006 1726882665.78214: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34006 1726882665.78704: in run() - task 12673a56-9f93-11ce-7734-000000000111 34006 1726882665.78719: variable 'ansible_search_path' from source: unknown 34006 1726882665.78723: variable 'ansible_search_path' from source: unknown 34006 1726882665.78760: calling self._execute() 34006 1726882665.78961: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.78965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.78976: variable 'omit' from source: magic vars 34006 1726882665.79928: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.79936: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.80163: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.80168: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.80171: when evaluation is False, skipping this task 34006 1726882665.80174: _execute() done 34006 1726882665.80178: dumping result to json 34006 1726882665.80180: done dumping result, returning 34006 1726882665.80194: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-11ce-7734-000000000111] 34006 1726882665.80197: sending task result for task 12673a56-9f93-11ce-7734-000000000111 34006 1726882665.80662: done sending task result for task 12673a56-9f93-11ce-7734-000000000111 34006 1726882665.80665: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "ansible_distribution_major_version == '7'" } 34006 1726882665.80712: no more pending results, returning what we have 34006 1726882665.80715: results queue empty 34006 1726882665.80716: checking for any_errors_fatal 34006 1726882665.80720: done checking for 
any_errors_fatal 34006 1726882665.80721: checking for max_fail_percentage 34006 1726882665.80722: done checking for max_fail_percentage 34006 1726882665.80723: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.80724: done checking to see if all hosts have failed 34006 1726882665.80725: getting the remaining hosts for this loop 34006 1726882665.80726: done getting the remaining hosts for this loop 34006 1726882665.80729: getting the next task for host managed_node3 34006 1726882665.80736: done getting next task for host managed_node3 34006 1726882665.80739: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882665.80744: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.80761: getting variables 34006 1726882665.80763: in VariableManager get_vars() 34006 1726882665.80948: Calling all_inventory to load vars for managed_node3 34006 1726882665.80951: Calling groups_inventory to load vars for managed_node3 34006 1726882665.80953: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.80962: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.80965: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.80968: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.81498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.81998: done with get_vars() 34006 1726882665.82009: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:37:45 -0400 (0:00:00.049) 0:00:05.598 ****** 34006 1726882665.82218: entering _queue_task() for managed_node3/ping 34006 1726882665.82703: worker is 1 (out of 1 available) 34006 1726882665.82716: exiting _queue_task() for managed_node3/ping 34006 1726882665.82728: done queuing things up, now waiting for results queue to drain 34006 1726882665.82729: waiting for pending results... 
34006 1726882665.83210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 34006 1726882665.83599: in run() - task 12673a56-9f93-11ce-7734-000000000112 34006 1726882665.83603: variable 'ansible_search_path' from source: unknown 34006 1726882665.83606: variable 'ansible_search_path' from source: unknown 34006 1726882665.83610: calling self._execute() 34006 1726882665.83728: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.83732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.83958: variable 'omit' from source: magic vars 34006 1726882665.84601: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.84605: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.84713: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.84717: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.84720: when evaluation is False, skipping this task 34006 1726882665.84725: _execute() done 34006 1726882665.84727: dumping result to json 34006 1726882665.84730: done dumping result, returning 34006 1726882665.84732: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-11ce-7734-000000000112] 34006 1726882665.84734: sending task result for task 12673a56-9f93-11ce-7734-000000000112 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.85091: no more pending results, returning what we have 34006 1726882665.85097: results queue empty 34006 1726882665.85099: checking for any_errors_fatal 34006 1726882665.85105: done checking for any_errors_fatal 34006 1726882665.85106: checking for max_fail_percentage 34006 1726882665.85108: done checking for 
max_fail_percentage 34006 1726882665.85109: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.85109: done checking to see if all hosts have failed 34006 1726882665.85110: getting the remaining hosts for this loop 34006 1726882665.85111: done getting the remaining hosts for this loop 34006 1726882665.85115: getting the next task for host managed_node3 34006 1726882665.85125: done getting next task for host managed_node3 34006 1726882665.85126: ^ task is: TASK: meta (role_complete) 34006 1726882665.85131: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.85149: getting variables 34006 1726882665.85150: in VariableManager get_vars() 34006 1726882665.85190: Calling all_inventory to load vars for managed_node3 34006 1726882665.85194: Calling groups_inventory to load vars for managed_node3 34006 1726882665.85197: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.85207: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.85210: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.85213: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.85419: done sending task result for task 12673a56-9f93-11ce-7734-000000000112 34006 1726882665.85423: WORKER PROCESS EXITING 34006 1726882665.85445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.85656: done with get_vars() 34006 1726882665.85666: done getting variables 34006 1726882665.85752: done queuing things up, now waiting for results queue to drain 34006 1726882665.85754: results queue empty 34006 1726882665.85755: checking for any_errors_fatal 34006 1726882665.85756: done checking for any_errors_fatal 34006 1726882665.85757: checking for max_fail_percentage 34006 1726882665.85758: done checking for max_fail_percentage 34006 1726882665.85759: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.85760: done checking to see if all hosts have failed 34006 1726882665.85760: getting the remaining hosts for this loop 34006 1726882665.85761: done getting the remaining hosts for this loop 34006 1726882665.85764: getting the next task for host managed_node3 34006 1726882665.85767: done getting next task for host managed_node3 34006 1726882665.85770: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 34006 1726882665.85772: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34006 1726882665.85774: getting variables 34006 1726882665.85775: in VariableManager get_vars() 34006 1726882665.85796: Calling all_inventory to load vars for managed_node3 34006 1726882665.85799: Calling groups_inventory to load vars for managed_node3 34006 1726882665.85801: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.85805: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.85808: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.85810: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.85954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.86146: done with get_vars() 34006 1726882665.86155: done getting variables TASK [Include the task 'cleanup_mock_wifi.yml'] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96 Friday 20 September 2024 21:37:45 -0400 (0:00:00.040) 0:00:05.638 ****** 34006 1726882665.86231: entering _queue_task() for managed_node3/include_tasks 34006 1726882665.86512: worker is 1 (out of 1 available) 34006 1726882665.86524: exiting _queue_task() for managed_node3/include_tasks 34006 1726882665.86535: done queuing things up, now waiting for results queue to drain 34006 1726882665.86537: waiting for pending results... 
34006 1726882665.86944: running TaskExecutor() for managed_node3/TASK: Include the task 'cleanup_mock_wifi.yml' 34006 1726882665.87168: in run() - task 12673a56-9f93-11ce-7734-000000000142 34006 1726882665.87191: variable 'ansible_search_path' from source: unknown 34006 1726882665.87304: calling self._execute() 34006 1726882665.87591: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.87597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.87600: variable 'omit' from source: magic vars 34006 1726882665.88157: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.88177: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.88573: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.88577: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.88579: when evaluation is False, skipping this task 34006 1726882665.88582: _execute() done 34006 1726882665.88584: dumping result to json 34006 1726882665.88586: done dumping result, returning 34006 1726882665.88591: done running TaskExecutor() for managed_node3/TASK: Include the task 'cleanup_mock_wifi.yml' [12673a56-9f93-11ce-7734-000000000142] 34006 1726882665.88595: sending task result for task 12673a56-9f93-11ce-7734-000000000142 34006 1726882665.88666: done sending task result for task 12673a56-9f93-11ce-7734-000000000142 34006 1726882665.88669: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.88726: no more pending results, returning what we have 34006 1726882665.88730: results queue empty 34006 1726882665.88731: checking for any_errors_fatal 34006 1726882665.88733: done checking for any_errors_fatal 34006 1726882665.88733: checking for max_fail_percentage 34006 
1726882665.88735: done checking for max_fail_percentage 34006 1726882665.88735: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.88736: done checking to see if all hosts have failed 34006 1726882665.88737: getting the remaining hosts for this loop 34006 1726882665.88738: done getting the remaining hosts for this loop 34006 1726882665.88741: getting the next task for host managed_node3 34006 1726882665.88748: done getting next task for host managed_node3 34006 1726882665.88751: ^ task is: TASK: Verify network state restored to default 34006 1726882665.88754: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34006 1726882665.88757: getting variables 34006 1726882665.88758: in VariableManager get_vars() 34006 1726882665.88829: Calling all_inventory to load vars for managed_node3 34006 1726882665.88832: Calling groups_inventory to load vars for managed_node3 34006 1726882665.88834: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.88849: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.88852: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.88855: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.89117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.89605: done with get_vars() 34006 1726882665.89612: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98 Friday 20 September 2024 21:37:45 -0400 (0:00:00.034) 0:00:05.673 ****** 34006 1726882665.89681: entering _queue_task() for managed_node3/include_tasks 34006 1726882665.89877: worker is 1 (out of 1 available) 34006 1726882665.89889: exiting _queue_task() for managed_node3/include_tasks 34006 1726882665.89901: done queuing things up, now waiting for results queue to drain 34006 1726882665.89902: waiting for pending results... 
34006 1726882665.90069: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 34006 1726882665.90139: in run() - task 12673a56-9f93-11ce-7734-000000000143 34006 1726882665.90159: variable 'ansible_search_path' from source: unknown 34006 1726882665.90184: calling self._execute() 34006 1726882665.90251: variable 'ansible_host' from source: host vars for 'managed_node3' 34006 1726882665.90255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 34006 1726882665.90264: variable 'omit' from source: magic vars 34006 1726882665.90540: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.90550: Evaluated conditional (ansible_distribution_major_version != '6'): True 34006 1726882665.90630: variable 'ansible_distribution_major_version' from source: facts 34006 1726882665.90634: Evaluated conditional (ansible_distribution_major_version == '7'): False 34006 1726882665.90637: when evaluation is False, skipping this task 34006 1726882665.90640: _execute() done 34006 1726882665.90642: dumping result to json 34006 1726882665.90647: done dumping result, returning 34006 1726882665.90652: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [12673a56-9f93-11ce-7734-000000000143] 34006 1726882665.90657: sending task result for task 12673a56-9f93-11ce-7734-000000000143 34006 1726882665.90745: done sending task result for task 12673a56-9f93-11ce-7734-000000000143 34006 1726882665.90749: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34006 1726882665.90827: no more pending results, returning what we have 34006 1726882665.90832: results queue empty 34006 1726882665.90833: checking for any_errors_fatal 34006 1726882665.90844: done checking for any_errors_fatal 34006 1726882665.90845: checking for max_fail_percentage 34006 
1726882665.90847: done checking for max_fail_percentage 34006 1726882665.90847: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.90848: done checking to see if all hosts have failed 34006 1726882665.90849: getting the remaining hosts for this loop 34006 1726882665.90850: done getting the remaining hosts for this loop 34006 1726882665.90854: getting the next task for host managed_node3 34006 1726882665.90863: done getting next task for host managed_node3 34006 1726882665.90865: ^ task is: TASK: meta (flush_handlers) 34006 1726882665.90866: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882665.90870: getting variables 34006 1726882665.90872: in VariableManager get_vars() 34006 1726882665.90908: Calling all_inventory to load vars for managed_node3 34006 1726882665.90910: Calling groups_inventory to load vars for managed_node3 34006 1726882665.90912: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.90921: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.90923: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.90926: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.91092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.91477: done with get_vars() 34006 1726882665.91486: done getting variables 34006 1726882665.91678: in VariableManager get_vars() 34006 1726882665.91700: Calling all_inventory to load vars for managed_node3 34006 1726882665.91703: Calling groups_inventory to load vars for managed_node3 34006 1726882665.91705: Calling all_plugins_inventory to load vars for managed_node3 34006 
1726882665.91710: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.91712: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.91715: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.91952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.92200: done with get_vars() 34006 1726882665.92212: done queuing things up, now waiting for results queue to drain 34006 1726882665.92213: results queue empty 34006 1726882665.92214: checking for any_errors_fatal 34006 1726882665.92216: done checking for any_errors_fatal 34006 1726882665.92218: checking for max_fail_percentage 34006 1726882665.92219: done checking for max_fail_percentage 34006 1726882665.92220: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.92221: done checking to see if all hosts have failed 34006 1726882665.92221: getting the remaining hosts for this loop 34006 1726882665.92222: done getting the remaining hosts for this loop 34006 1726882665.92224: getting the next task for host managed_node3 34006 1726882665.92227: done getting next task for host managed_node3 34006 1726882665.92229: ^ task is: TASK: meta (flush_handlers) 34006 1726882665.92230: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34006 1726882665.92232: getting variables 34006 1726882665.92233: in VariableManager get_vars() 34006 1726882665.92248: Calling all_inventory to load vars for managed_node3 34006 1726882665.92250: Calling groups_inventory to load vars for managed_node3 34006 1726882665.92252: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.92256: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.92258: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.92260: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.92415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.92627: done with get_vars() 34006 1726882665.92636: done getting variables 34006 1726882665.92679: in VariableManager get_vars() 34006 1726882665.92700: Calling all_inventory to load vars for managed_node3 34006 1726882665.92702: Calling groups_inventory to load vars for managed_node3 34006 1726882665.92704: Calling all_plugins_inventory to load vars for managed_node3 34006 1726882665.92708: Calling all_plugins_play to load vars for managed_node3 34006 1726882665.92710: Calling groups_plugins_inventory to load vars for managed_node3 34006 1726882665.92713: Calling groups_plugins_play to load vars for managed_node3 34006 1726882665.92812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34006 1726882665.92977: done with get_vars() 34006 1726882665.92990: done queuing things up, now waiting for results queue to drain 34006 1726882665.92991: results queue empty 34006 1726882665.92992: checking for any_errors_fatal 34006 1726882665.92994: done checking for any_errors_fatal 34006 1726882665.92995: checking for max_fail_percentage 34006 1726882665.92995: done checking for max_fail_percentage 34006 1726882665.92996: checking to see if all hosts have failed and the running result is not 
ok 34006 1726882665.92996: done checking to see if all hosts have failed 34006 1726882665.92997: getting the remaining hosts for this loop 34006 1726882665.92997: done getting the remaining hosts for this loop 34006 1726882665.93002: getting the next task for host managed_node3 34006 1726882665.93004: done getting next task for host managed_node3 34006 1726882665.93006: ^ task is: None 34006 1726882665.93008: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34006 1726882665.93009: done queuing things up, now waiting for results queue to drain 34006 1726882665.93010: results queue empty 34006 1726882665.93010: checking for any_errors_fatal 34006 1726882665.93011: done checking for any_errors_fatal 34006 1726882665.93012: checking for max_fail_percentage 34006 1726882665.93013: done checking for max_fail_percentage 34006 1726882665.93013: checking to see if all hosts have failed and the running result is not ok 34006 1726882665.93014: done checking to see if all hosts have failed 34006 1726882665.93015: getting the next task for host managed_node3 34006 1726882665.93018: done getting next task for host managed_node3 34006 1726882665.93018: ^ task is: None 34006 1726882665.93020: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed_node3 : ok=7 changed=0 unreachable=0 failed=0 skipped=102 rescued=0 ignored=0 Friday 20 September 2024 21:37:45 -0400 (0:00:00.033) 0:00:05.707 ****** =============================================================================== Gathering Facts --------------------------------------------------------- 1.60s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6 Gather the minimum subset of ansible_facts required by the network role test --- 0.74s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Check if system is ostree ----------------------------------------------- 0.54s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Copy client certs ------------------------------------------------------- 0.08s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13 fedora.linux_system_roles.network : Show stderr messages for the network_connections --- 0.06s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.05s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Set network provider to 'nm' -------------------------------------------- 0.05s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.05s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Include the task 'enable_epel.yml' -------------------------------------- 0.05s 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.05s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Include the task 'el_repo_setup.yml' ------------------------------------ 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 fedora.linux_system_roles.network : Ensure ansible_facts used by role --- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Include the task 'setup_mock_wifi.yml' ---------------------------------- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 TEST: wireless connection with WPA-PSK ---------------------------------- 0.04s 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24 fedora.linux_system_roles.network : Print network provider -------------- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 fedora.linux_system_roles.network : Enable network service -------------- 0.04s /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 34006 1726882665.93103: RUNNING CLEANUP