[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
34296 1726855343.54512: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-ZzD
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
34296 1726855343.54812: Added group all to inventory
34296 1726855343.54813: Added group ungrouped to inventory
34296 1726855343.54816: Group all now contains ungrouped
34296 1726855343.54818: Examining possible inventory source: /tmp/network-Koj/inventory.yml
34296 1726855343.70065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
34296 1726855343.70131: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
34296 1726855343.70154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
34296 1726855343.70216: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
34296 1726855343.70294: Loaded config def from plugin (inventory/script)
34296 1726855343.70296: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
34296 1726855343.70336: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
34296 1726855343.70426: Loaded config def from plugin (inventory/yaml)
34296 1726855343.70428: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
34296 1726855343.70517: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
34296 1726855343.70943: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
34296 1726855343.70946: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
34296 1726855343.70950: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
34296 1726855343.70956: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
34296 1726855343.70960: Loading data from /tmp/network-Koj/inventory.yml
34296 1726855343.71031: /tmp/network-Koj/inventory.yml was not parsable by auto
34296 1726855343.71101: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
34296 1726855343.71140: Loading data from /tmp/network-Koj/inventory.yml
34296 1726855343.71226: group all already in inventory
34296 1726855343.71233: set inventory_file for managed_node1
34296 1726855343.71237: set inventory_dir for managed_node1
34296 1726855343.71238: Added host managed_node1 to inventory
34296 1726855343.71240: Added host managed_node1 to group all
34296 1726855343.71241: set ansible_host for managed_node1
34296 1726855343.71242: set ansible_ssh_extra_args for managed_node1
34296 1726855343.71245: set inventory_file for managed_node2
34296 1726855343.71247: set inventory_dir for managed_node2
34296 1726855343.71248: Added host managed_node2 to inventory
34296 1726855343.71249: Added host managed_node2 to group all
34296 1726855343.71250: set ansible_host for managed_node2
34296 1726855343.71251: set ansible_ssh_extra_args for managed_node2
34296 1726855343.71253: set inventory_file for managed_node3
34296 1726855343.71255: set inventory_dir for managed_node3
34296 1726855343.71256: Added host managed_node3 to inventory
34296 1726855343.71257: Added host managed_node3 to group all
34296 1726855343.71258: set ansible_host for managed_node3
34296 1726855343.71259: set ansible_ssh_extra_args for managed_node3
34296 1726855343.71261: Reconcile groups and hosts in inventory.
34296 1726855343.71264: Group ungrouped now contains managed_node1
34296 1726855343.71269: Group ungrouped now contains managed_node2
34296 1726855343.71271: Group ungrouped now contains managed_node3
34296 1726855343.71345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
34296 1726855343.71471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
34296 1726855343.71521: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
34296 1726855343.71548: Loaded config def from plugin (vars/host_group_vars)
34296 1726855343.71550: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
34296 1726855343.71557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
34296 1726855343.71565: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
34296 1726855343.71611: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
34296 1726855343.71948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855343.72040: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
34296 1726855343.72076: Loaded config def from plugin (connection/local)
34296 1726855343.72079: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
34296 1726855343.72762: Loaded config def from plugin (connection/paramiko_ssh)
34296 1726855343.72769: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
34296 1726855343.73705: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34296 1726855343.73745: Loaded config def from plugin (connection/psrp)
34296 1726855343.73748: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
34296 1726855343.74480: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34296 1726855343.74521: Loaded config def from plugin (connection/ssh)
34296 1726855343.74524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
34296 1726855343.76512: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
34296 1726855343.76552: Loaded config def from plugin (connection/winrm)
34296 1726855343.76555: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
34296 1726855343.76593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
34296 1726855343.76659: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
34296 1726855343.76732: Loaded config def from plugin (shell/cmd)
34296 1726855343.76734: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
34296 1726855343.76762: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
34296 1726855343.76833: Loaded config def from plugin (shell/powershell)
34296 1726855343.76835: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
34296 1726855343.76892: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
34296 1726855343.77073: Loaded config def from plugin (shell/sh)
34296 1726855343.77075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
34296 1726855343.77112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
34296 1726855343.77234: Loaded config def from plugin (become/runas)
34296 1726855343.77236: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
34296 1726855343.77422: Loaded config def from plugin (become/su)
34296 1726855343.77425: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
34296 1726855343.77588: Loaded config def from plugin (become/sudo)
34296 1726855343.77591: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
34296 1726855343.77623: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34296 1726855343.77958: in VariableManager get_vars()
34296 1726855343.77983: done with get_vars()
34296 1726855343.78114: trying /usr/local/lib/python3.12/site-packages/ansible/modules
34296 1726855343.81093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
34296 1726855343.81204: in VariableManager get_vars()
34296 1726855343.81209: done with get_vars()
34296 1726855343.81212: variable 'playbook_dir' from source: magic vars
34296 1726855343.81213: variable 'ansible_playbook_python' from source: magic vars
34296 1726855343.81214: variable 'ansible_config_file' from source: magic vars
34296 1726855343.81214: variable 'groups' from source: magic vars
34296 1726855343.81215: variable 'omit' from source: magic vars
34296 1726855343.81216: variable 'ansible_version' from source: magic vars
34296 1726855343.81217: variable 'ansible_check_mode' from source: magic vars
34296 1726855343.81217: variable 'ansible_diff_mode' from source: magic vars
34296 1726855343.81218: variable 'ansible_forks' from source: magic vars
34296 1726855343.81219: variable 'ansible_inventory_sources' from source: magic vars
34296 1726855343.81219: variable 'ansible_skip_tags' from source: magic vars
34296 1726855343.81220: variable 'ansible_limit' from source: magic vars
34296 1726855343.81221: variable 'ansible_run_tags' from source: magic vars
34296 1726855343.81222: variable 'ansible_verbosity' from source: magic vars
34296 1726855343.81256: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml
34296 1726855343.81847: in VariableManager get_vars()
34296 1726855343.81864: done with get_vars()
34296 1726855343.81993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
34296 1726855343.82184: in VariableManager get_vars()
34296 1726855343.82199: done with get_vars()
34296 1726855343.82204: variable 'omit' from source: magic vars
34296 1726855343.82222: variable 'omit' from source: magic vars
34296 1726855343.82255: in VariableManager get_vars()
34296 1726855343.82265: done with get_vars()
34296 1726855343.82314: in VariableManager get_vars()
34296 1726855343.82327: done with get_vars()
34296 1726855343.82360: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34296 1726855343.82583: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34296 1726855343.82720: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34296 1726855343.83377: in VariableManager get_vars()
34296 1726855343.83399: done with get_vars()
34296 1726855343.83805: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
34296 1726855343.83948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34296 1726855343.85560: in VariableManager get_vars()
34296 1726855343.85585: done with get_vars()
34296 1726855343.85592: variable 'omit' from source: magic vars
34296 1726855343.85605: variable 'omit' from source: magic vars
34296 1726855343.85643: in VariableManager get_vars()
34296 1726855343.85679: done with get_vars()
34296 1726855343.85703: in VariableManager get_vars()
34296 1726855343.85719: done with get_vars()
34296 1726855343.85749: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34296 1726855343.85868: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34296 1726855343.85945: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34296 1726855343.88281: in VariableManager get_vars()
34296 1726855343.88301: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34296 1726855343.89646: in VariableManager get_vars()
34296 1726855343.89671: done with get_vars()
34296 1726855343.89677: variable 'omit' from source: magic vars
34296 1726855343.89690: variable 'omit' from source: magic vars
34296 1726855343.89722: in VariableManager get_vars()
34296 1726855343.89739: done with get_vars()
34296 1726855343.89759: in VariableManager get_vars()
34296 1726855343.89779: done with get_vars()
34296 1726855343.89809: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34296 1726855343.89944: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34296 1726855343.90024: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34296 1726855343.90413: in VariableManager get_vars()
34296 1726855343.90438: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34296 1726855343.91801: in VariableManager get_vars()
34296 1726855343.91818: done with get_vars()
34296 1726855343.91821: variable 'omit' from source: magic vars
34296 1726855343.91836: variable 'omit' from source: magic vars
34296 1726855343.91861: in VariableManager get_vars()
34296 1726855343.91876: done with get_vars()
34296 1726855343.91891: in VariableManager get_vars()
34296 1726855343.91904: done with get_vars()
34296 1726855343.91922: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
34296 1726855343.92005: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
34296 1726855343.92060: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
34296 1726855343.92285: in VariableManager get_vars()
34296 1726855343.92304: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34296 1726855343.94263: in VariableManager get_vars()
34296 1726855343.94298: done with get_vars()
34296 1726855343.94336: in VariableManager get_vars()
34296 1726855343.94359: done with get_vars()
34296 1726855343.94423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
34296 1726855343.94453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
34296 1726855343.94756: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
34296 1726855343.94934: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
34296 1726855343.94936: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
34296 1726855343.94957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
34296 1726855343.94976: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
34296 1726855343.95097: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
34296 1726855343.95133: Loaded config def from plugin (callback/default)
34296 1726855343.95135: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34296 1726855343.96017: Loaded config def from plugin (callback/junit)
34296 1726855343.96019: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34296 1726855343.96060: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
34296 1726855343.96119: Loaded config def from plugin (callback/minimal)
34296 1726855343.96121: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34296 1726855343.96154: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
34296 1726855343.96194: Loaded config def from plugin (callback/tree)
34296 1726855343.96196: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
34296 1726855343.96270: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
34296 1726855343.96271: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-ZzD/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34296 1726855343.96292: in VariableManager get_vars()
34296 1726855343.96302: done with get_vars()
34296 1726855343.96306: in VariableManager get_vars()
34296 1726855343.96313: done with get_vars()
34296 1726855343.96315: variable 'omit' from source: magic vars
34296 1726855343.96339: in VariableManager get_vars()
34296 1726855343.96348: done with get_vars()
34296 1726855343.96361: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
34296 1726855343.96843: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
34296 1726855343.96916: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
34296 1726855343.96947: getting the remaining hosts for this loop
34296 1726855343.96949: done getting the remaining hosts for this loop
34296 1726855343.96952: getting the next task for host managed_node1
34296 1726855343.96955: done getting next task for host managed_node1
34296 1726855343.96957: ^ task is: TASK: Gathering Facts
34296 1726855343.96958: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855343.96960: getting variables
34296 1726855343.96961: in VariableManager get_vars()
34296 1726855343.96970: Calling all_inventory to load vars for managed_node1
34296 1726855343.96973: Calling groups_inventory to load vars for managed_node1
34296 1726855343.96975: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855343.96985: Calling all_plugins_play to load vars for managed_node1
34296 1726855343.96998: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855343.97001: Calling groups_plugins_play to load vars for managed_node1
34296 1726855343.97031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855343.97091: done with get_vars()
34296 1726855343.97097: done getting variables
34296 1726855343.97170: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024 14:02:23 -0400 (0:00:00.009)       0:00:00.009 ******
34296 1726855343.97192: entering _queue_task() for managed_node1/gather_facts
34296 1726855343.97193: Creating lock for gather_facts
34296 1726855343.97546: worker is 1 (out of 1 available)
34296 1726855343.97558: exiting _queue_task() for managed_node1/gather_facts
34296 1726855343.97570: done queuing things up, now waiting for results queue to drain
34296 1726855343.97572: waiting for pending results...
34296 1726855343.97882: running TaskExecutor() for managed_node1/TASK: Gathering Facts
34296 1726855343.97928: in run() - task 0affcc66-ac2b-a97a-1acc-000000000147
34296 1726855343.97937: variable 'ansible_search_path' from source: unknown
34296 1726855343.97969: calling self._execute()
34296 1726855343.98075: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855343.98079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855343.98081: variable 'omit' from source: magic vars
34296 1726855343.98218: variable 'omit' from source: magic vars
34296 1726855343.98252: variable 'omit' from source: magic vars
34296 1726855343.98309: variable 'omit' from source: magic vars
34296 1726855343.98398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
34296 1726855343.98412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
34296 1726855343.98436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
34296 1726855343.98457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34296 1726855343.98475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34296 1726855343.98521: variable 'inventory_hostname' from source: host vars for 'managed_node1'
34296 1726855343.98615: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855343.98618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855343.98647: Set connection var ansible_shell_type to sh
34296 1726855343.98661: Set connection var ansible_shell_executable to /bin/sh
34296 1726855343.98671: Set connection var ansible_connection to ssh
34296 1726855343.98684: Set connection var ansible_timeout to 10
34296 1726855343.98695: Set connection var ansible_module_compression to ZIP_DEFLATED
34296 1726855343.98705: Set connection var ansible_pipelining to False
34296 1726855343.98744: variable 'ansible_shell_executable' from source: unknown
34296 1726855343.98765: variable 'ansible_connection' from source: unknown
34296 1726855343.98786: variable 'ansible_module_compression' from source: unknown
34296 1726855343.98809: variable 'ansible_shell_type' from source: unknown
34296 1726855343.98839: variable 'ansible_shell_executable' from source: unknown
34296 1726855343.98892: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855343.98896: variable 'ansible_pipelining' from source: unknown
34296 1726855343.98899: variable 'ansible_timeout' from source: unknown
34296 1726855343.98901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855343.99118: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
34296 1726855343.99129: variable 'omit' from source: magic vars
34296 1726855343.99132: starting attempt loop
34296 1726855343.99135: running the handler
34296 1726855343.99149: variable 'ansible_facts' from source: unknown
34296 1726855343.99167: _low_level_execute_command(): starting
34296 1726855343.99176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
34296 1726855343.99706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
34296 1726855343.99710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34296 1726855343.99713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found <<<
34296 1726855343.99715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34296 1726855343.99762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<<
34296 1726855343.99766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34296 1726855343.99768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34296 1726855343.99837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34296 1726855344.01542: stdout chunk (state=3): >>>/root <<<
34296 1726855344.01632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34296 1726855344.01674: stderr chunk (state=3): >>><<<
34296 1726855344.01677: stdout chunk (state=3): >>><<<
34296 1726855344.01697: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34296 1726855344.01712: _low_level_execute_command(): starting
34296 1726855344.01764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755 `" && echo ansible-tmp-1726855344.0170276-34311-136721470171755="` echo /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755 `" ) && sleep 0'
34296 1726855344.02165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
34296 1726855344.02168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found <<<
34296 1726855344.02171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
34296 1726855344.02173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34296 1726855344.02183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34296 1726855344.02228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<<
34296 1726855344.02231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
34296 1726855344.02305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34296 1726855344.04229: stdout chunk (state=3): >>>ansible-tmp-1726855344.0170276-34311-136721470171755=/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755 <<<
34296 1726855344.04390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34296 1726855344.04394: stdout chunk (state=3): >>><<<
34296 1726855344.04396: stderr chunk (state=3): >>><<<
34296 1726855344.04593: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855344.0170276-34311-136721470171755=/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
34296 1726855344.04596: variable 'ansible_module_compression' from source: unknown
34296 1726855344.04598: ANSIBALLZ: Using generic lock for ansible.legacy.setup
34296 1726855344.04600: ANSIBALLZ: Acquiring lock
34296 1726855344.04602: ANSIBALLZ: Lock acquired: 139782747428656
34296 1726855344.04604: ANSIBALLZ: Creating module
34296 1726855344.25596: ANSIBALLZ: Writing module into payload
34296 1726855344.25749: ANSIBALLZ: Writing module
34296 1726855344.25778: ANSIBALLZ: Renaming module
34296 1726855344.25791: ANSIBALLZ: Done creating module
34296 1726855344.25836: variable 'ansible_facts' from source: unknown
34296 1726855344.25848: variable 'inventory_hostname' from source: host vars for 'managed_node1'
34296 1726855344.25862: _low_level_execute_command(): starting
34296 1726855344.25872: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
34296 1726855344.26480: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34296 1726855344.26497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34296 1726855344.26512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34296 1726855344.26527: stderr
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34296 1726855344.26542: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 <<< 34296 1726855344.26605: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855344.26649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855344.26667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855344.26691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855344.26781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855344.28599: stdout chunk (state=3): >>>PLATFORM <<< 34296 1726855344.28807: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34296 1726855344.28881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855344.28885: stdout chunk (state=3): >>><<< 34296 1726855344.28890: stderr chunk (state=3): >>><<< 34296 1726855344.28908: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855344.28923 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34296 1726855344.28978: _low_level_execute_command(): starting 34296 1726855344.29260: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34296 1726855344.29328: Sending initial data 34296 1726855344.29332: Sent initial data (1181 bytes) 34296 1726855344.30331: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855344.30348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855344.30525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855344.30538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855344.30612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855344.34120: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 34296 1726855344.34912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855344.34948: stderr chunk (state=3): >>><<< 34296 1726855344.34982: stdout chunk (state=3): >>><<< 34296 1726855344.35194: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855344.35197: variable 'ansible_facts' from source: unknown 34296 1726855344.35199: variable 'ansible_facts' from source: unknown 34296 1726855344.35202: variable 'ansible_module_compression' from source: unknown 34296 1726855344.35331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-342968v_83h8s/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34296 1726855344.35364: variable 'ansible_facts' from source: unknown 34296 1726855344.35767: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py 34296 1726855344.36085: Sending initial data 34296 1726855344.36091: Sent initial data (154 bytes) 34296 1726855344.37307: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855344.37462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855344.37526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855344.37620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855344.39262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34296 1726855344.39316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34296 1726855344.39501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-342968v_83h8s/tmpjvzirtwi /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py <<< 34296 1726855344.39582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-342968v_83h8s/tmpjvzirtwi" to remote "/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py" <<< 34296 1726855344.42903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855344.42928: stderr chunk (state=3): >>><<< 34296 1726855344.43004: stdout chunk (state=3): >>><<< 34296 1726855344.43076: done transferring module to remote 34296 1726855344.43164: _low_level_execute_command(): starting 34296 1726855344.43178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/ /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py && sleep 0' 34296 1726855344.44538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 34296 1726855344.44542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855344.44544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855344.44552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855344.44742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855344.45105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855344.47013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855344.47040: stderr chunk (state=3): >>><<< 34296 1726855344.47050: stdout chunk (state=3): >>><<< 34296 1726855344.47076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855344.47084: _low_level_execute_command(): starting 34296 1726855344.47101: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/AnsiballZ_setup.py && sleep 0' 34296 1726855344.48431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34296 1726855344.48435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855344.48438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855344.48457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34296 1726855344.48477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 <<< 34296 1726855344.48679: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 34296 1726855344.48704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855344.48828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855344.48893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855344.51241: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 34296 1726855344.51284: stdout chunk (state=3): >>>import 'posix' # <<< 34296 1726855344.51340: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 34296 1726855344.51395: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.51445: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 34296 1726855344.51479: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34296 1726855344.51554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643f8bb00> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fbea50> <<< 34296 1726855344.51790: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 34296 1726855344.51797: stdout chunk (state=3): >>>import 'os' # <<< 34296 1726855344.51813: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34296 1726855344.51828: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 34296 1726855344.51893: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34296 1726855344.51897: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34296 1726855344.51982: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fcd130> <<< 34296 1726855344.51998: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fcdfa0> <<< 34296 1726855344.52007: stdout chunk (state=3): >>>import 'site' # <<< 34296 1726855344.52042: stdout chunk 
(state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34296 1726855344.52418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34296 1726855344.52545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34296 1726855344.52563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34296 1726855344.52596: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dabda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34296 1726855344.52640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dabfe0> <<< 34296 1726855344.52658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34296 1726855344.52744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34296 1726855344.52747: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 34296 1726855344.52775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 34296 1726855344.52811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643de37a0> <<< 34296 1726855344.52826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 34296 1726855344.52861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643de3e30> import '_collections' # <<< 34296 1726855344.52910: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc3a70> <<< 34296 1726855344.52972: stdout chunk (state=3): >>>import '_functools' # <<< 34296 1726855344.52988: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc1190> <<< 34296 1726855344.53069: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da8f50> <<< 34296 1726855344.53073: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34296 1726855344.53315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e03710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e02330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643daa810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e387a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da81d0> <<< 34296 1726855344.53337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34296 1726855344.53371: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855344.53393: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e38c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e38b00> <<< 34296 1726855344.53442: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e38ec0> <<< 34296 1726855344.53692: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34296 1726855344.53695: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e39280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34296 1726855344.53698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34296 1726855344.53705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e506e0> 
<<< 34296 1726855344.53756: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e51df0> <<< 34296 1726855344.54018: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e52c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e532c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e521b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e53d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e53470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3a510> <<< 34296 1726855344.54030: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34296 1726855344.54092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34296 1726855344.54095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34296 1726855344.54124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b5fb90> <<< 34296 1726855344.54146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34296 1726855344.54178: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855344.54199: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b88620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88380> <<< 34296 1726855344.54214: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5643b88650> <<< 34296 1726855344.54472: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b88f80> <<< 34296 1726855344.54818: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b898b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b5dd30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b8ac00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.54834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34296 1726855344.54916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bb6f60> <<< 34296 1726855344.54993: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34296 1726855344.54997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.55003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34296 1726855344.55011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34296 1726855344.55050: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bdb320> <<< 34296 1726855344.55074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34296 1726855344.55193: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34296 1726855344.55217: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c07f80> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/parse.py <<< 34296 1726855344.55250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34296 1726855344.55565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c3a7e0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c381a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bdbfb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435291c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bda120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b8bb60> <<< 34296 1726855344.55868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5643bda240> <<< 34296 1726855344.56030: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z7kx84y4/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 34296 1726855344.56311: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object 
from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34296 1726855344.56328: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564358ae40> <<< 34296 1726855344.56340: stdout chunk (state=3): >>>import '_typing' # <<< 34296 1726855344.56529: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643569d30> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643568ef0> <<< 34296 1726855344.56532: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.56563: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34296 1726855344.56631: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 34296 1726855344.58025: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.59322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643588ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches 
/usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c2870> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c2600> <<< 34296 1726855344.59472: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c1f10> <<< 34296 1726855344.59476: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564358b860> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c35c0> <<< 34296 1726855344.59496: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c3800> <<< 34296 1726855344.59509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34296 
1726855344.59550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34296 1726855344.59569: stdout chunk (state=3): >>>import '_locale' # <<< 34296 1726855344.59605: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c3d40> <<< 34296 1726855344.59622: stdout chunk (state=3): >>>import 'pwd' # <<< 34296 1726855344.59680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34296 1726855344.59705: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643425a00> <<< 34296 1726855344.59726: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643427680> <<< 34296 1726855344.59791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34296 1726855344.59865: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643427f80> <<< 34296 1726855344.59888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342cf80> <<< 34296 1726855344.59908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc 
matches /usr/lib64/python3.12/subprocess.py <<< 34296 1726855344.59925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34296 1726855344.59949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34296 1726855344.60005: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342fce0> <<< 34296 1726855344.60041: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643bb6ed0> <<< 34296 1726855344.60060: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342dfa0> <<< 34296 1726855344.60115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34296 1726855344.60301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34296 
1726855344.60322: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643433c20> import '_tokenize' # <<< 34296 1726855344.60953: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643432720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643432480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434329c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643477e00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643477800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module 
'_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643479a30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434797f0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34296 1726855344.60957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34296 1726855344.60959: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564347bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347a120> <<< 34296 1726855344.60962: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34296 1726855344.60964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.61215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347f5f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' 
import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643480740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56434808c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643480950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643478170> <<< 34296 1726855344.61260: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 34296 1726855344.61413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643483fe0> <<< 34296 1726855344.61710: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed 
from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564330d250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434827e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643483b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643482420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.61771: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.61812: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34296 1726855344.61815: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.61818: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.61829: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 34296 1726855344.61842: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.62112: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.62707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.63162: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 34296 1726855344.63179: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 34296 1726855344.63202: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34296 1726855344.63213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.63272: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643311460> <<< 34296 1726855344.63361: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34296 1726855344.63382: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643312240> <<< 34296 1726855344.63394: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434804a0> <<< 34296 1726855344.63429: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34296 1726855344.63446: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.63464: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.63582: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34296 1726855344.63638: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.63797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34296 1726855344.63815: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56433122d0> <<< 34296 1726855344.63826: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.64285: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.64732: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.64802: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65105: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.65131: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34296 1726855344.65135: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65149: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34296 1726855344.65165: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65208: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65241: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34296 1726855344.65451: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65657: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.65714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34296 1726855344.65778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34296 1726855344.65791: stdout chunk (state=3): >>>import '_ast' # <<< 34296 1726855344.65974: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643313380> <<< 34296 1726855344.65978: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 
1726855344.65980: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66016: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34296 1726855344.66032: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 34296 1726855344.66051: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66102: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66138: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34296 1726855344.66151: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66592: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66596: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.66601: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564331dd30> <<< 34296 1726855344.66603: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643318d70> <<< 34296 1726855344.66605: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34296 1726855344.66607: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34296 
1726855344.66645: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66705: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66728: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.66778: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.66799: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34296 1726855344.66821: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34296 1726855344.66941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34296 1726855344.66944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34296 1726855344.66946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34296 1726855344.66948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34296 1726855344.67048: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643406720> <<< 34296 1726855344.67057: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434fe420> <<< 34296 1726855344.67275: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564331de20> <<< 34296 1726855344.67279: stdout chunk (state=3): >>>import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5643310f80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34296 1726855344.67282: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34296 1726855344.67299: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67373: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 34296 1726855344.67385: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67437: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67454: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67479: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67522: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67664: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 34296 1726855344.67724: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67794: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67817: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.67861: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34296 1726855344.67870: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.68158: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.68210: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.68252: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.68405: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.68429: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b1b20> <<< 34296 1726855344.68449: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34296 1726855344.68471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34296 1726855344.68514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34296 1726855344.68539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34296 1726855344.68555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 34296 1726855344.68840: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642f8bbc0> # extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642f8bf50> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b31a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b2690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b01d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b0bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 34296 1726855344.68843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34296 1726855344.68860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 34296 1726855344.68875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34296 1726855344.68900: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642fa2de0> <<< 34296 1726855344.68913: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa26c0> <<< 34296 1726855344.68950: stdout chunk (state=3): >>># 
extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642fa2870> <<< 34296 1726855344.68957: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa1af0> <<< 34296 1726855344.68975: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34296 1726855344.69238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34296 1726855344.69242: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa2f00> <<< 34296 1726855344.69245: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34296 1726855344.69247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34296 1726855344.69249: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642ff99d0> <<< 34296 1726855344.69251: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa39b0> <<< 34296 1726855344.69253: stdout chunk (state=3): >>>import 'multiprocessing.pool' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f56433b1820> import 'ansible.module_utils.facts.timeout' # <<< 34296 1726855344.69374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 34296 1726855344.69384: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.69426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34296 1726855344.69449: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.69500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.69548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34296 1726855344.69564: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.69793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 34296 1726855344.69797: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.69799: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34296 1726855344.69801: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.69813: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.69861: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34296 1726855344.69873: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.70093: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.70096: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.70117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 
'ansible.module_utils.facts.system.cmdline' # <<< 34296 1726855344.70121: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.70601: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34296 1726855344.71114: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71117: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71150: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71189: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71337: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 34296 1726855344.71358: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.71411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34296 1726855344.71489: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.71507: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 34296 1726855344.71656: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.71742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 34296 1726855344.71764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34296 1726855344.71778: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642ffbe90> <<< 
34296 1726855344.71796: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34296 1726855344.71819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34296 1726855344.71939: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642ffa4b0> <<< 34296 1726855344.71955: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 34296 1726855344.72023: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.72098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 34296 1726855344.72190: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.72318: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 34296 1726855344.72351: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.72512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.72528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34296 1726855344.72650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855344.72704: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643039be0> <<< 34296 1726855344.72895: stdout chunk (state=3): >>>import 'ssl' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f564301d910> import 'ansible.module_utils.facts.system.python' # <<< 34296 1726855344.72907: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.72976: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 34296 1726855344.73108: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73193: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73443: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73446: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 34296 1726855344.73503: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73539: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34296 1726855344.73591: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73629: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.73721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855344.73936: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643041970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643041670> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34296 1726855344.73998: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.74354: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74402: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74448: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 34296 1726855344.74459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74482: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74498: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74788: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34296 1726855344.74803: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.74921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.75040: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34296 1726855344.75136: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.75139: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.75714: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' 
# <<< 34296 1726855344.76214: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76317: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34296 1726855344.76500: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76653: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34296 1726855344.76659: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76792: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34296 1726855344.76974: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.76998: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34296 1726855344.77103: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 34296 1726855344.77193: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.77290: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.77490: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.77723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 34296 1726855344.77910: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.77913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available 
<<< 34296 1726855344.78205: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 34296 1726855344.78310: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 34296 1726855344.78313: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.78603: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.79042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 34296 1726855344.79057: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79080: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 34296 1726855344.79157: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 34296 1726855344.79312: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79460: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34296 1726855344.79464: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 
1726855344.79497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 34296 1726855344.79534: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79631: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.79707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.79862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855344.79895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34296 1726855344.79908: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80112: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34296 1726855344.80306: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80369: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80407: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34296 1726855344.80520: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 34296 1726855344.80624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 34296 1726855344.80693: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 
1726855344.80843: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.80865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34296 1726855344.80944: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855344.81397: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34296 1726855344.81426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34296 1726855344.81499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642dd7500> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642dd42f0> <<< 34296 1726855344.81542: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642dd6ea0> <<< 34296 1726855344.97100: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34296 1726855344.97190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e1d310> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 34296 1726855344.97194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 34296 1726855344.97270: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e1ef90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855344.97320: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 34296 1726855344.97323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e684d0> <<< 34296 1726855344.97377: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e6b350> <<< 34296 1726855344.97594: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34296 1726855345.17808: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "02", "second": "24", "epoch": "1726855344", "epoch_int": "1726855344", "date": "2024-09-20", "time": "14:02:24", "iso8601_micro": "2024-09-20T18:02:24.813786Z", "iso8601": "2024-09-20T18:02:24Z", "iso8601_basic": "20240920T140224813786", "iso8601_basic_short": "20240920T140224", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-44", "ansible_nodename": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec256a8385d0e1834efdcee0241eb89f", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDHUYt5wfRkUoXhr+uEExQ+HhovkYI5uoocZphrjWB9VMTYAPkmr8CuX+VnCXEjIcFkU4gKaqEurk7K1oCCCvMVY+8Lx5A0D1tOL7nuQ3qS/VV7QoUpqykECy83nEA8/4uH5k1zBHFre8BVrJ2+ajOt+yFNJ09xVXQYHFaYl2Fb5yrh0VPKS0HbLIX/421Zj6H4fXZh2hVLH3p7ZnvXxgY+fNXOwkbFXFpljGMQuXNjflqfOabJUQ520FAteLnBbrkmG2hLRBKB+xMKVgsWad5mNepxZioFTgRyVzX0ykevriDqvEDKfAw+FLZSyT09gS4HrWGkyEOfKtmH24gWrtXrEEoTaNCRtSBTo2YVliQabO4JiMyDbKv/l842RgOZVW2ZYRhJt9vJlQH9vqiJec+B8CFp2wWmrk86VMrbbGpCUV3kumTaD569IJ8oPsDPdTkLjmfkbkbrD8F6DHXNDMfaKTNhwGrb7az2ZI4oXtuRV5Nj/asaepJFulEPiE8MsSM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLkB2rX0xX4F/DzMEydgf7lYw3g8CdOq7oErXDwnQFxylKGWmIu8Cb6CiKkdKvj7f3AU3Wj6OPQuHXkxsl5hV9I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMNckAa3R2XGvAGYYtN9sXs3nJFV9bRzT0omObiULH2l", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_pars<<< 34296 1726855345.17899: stdout chunk (state=3): >>>ed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": 
"disabled"}, "ansible_loadavg": {"1m": 0.349609375, "5m": 0.4296875, "15m": 0.28759765625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 40014 10.31.10.44 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 40014 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:b4:71:11:91:97", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.44", "broadcast": "10.31.11.255", "netmask": 
"255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10b4:71ff:fe11:9197", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.44", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:b4:71:11:91:97", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.44"], "ansible_all_ipv6_addresses": ["fe80::10b4:71ff:fe11:9197"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.44", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10b4:71ff:fe11:9197"]}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec256a83-85d0-e183-4efd-cee0241eb89f", "ansible_product_uuid": "ec256a83-85d0-e183-4efd-cee0241eb89f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1112, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790130176, "block_size": 4096, "block_total": 65519099, "block_available": 63913606, "block_used": 1605493, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 34296 1726855345.18501: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 34296 1726855345.18571: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 34296 1726855345.18598: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing 
functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 34296 1726855345.18628: stdout chunk 
(state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string <<< 34296 1726855345.18686: stdout chunk (state=3): >>># cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation <<< 34296 1726855345.18716: stdout chunk (state=3): >>># destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34296 1726855345.18764: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb <<< 34296 1726855345.18815: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd <<< 34296 1726855345.18829: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.sunos 
# cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # 
destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34296 1726855345.19170: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34296 1726855345.19223: stdout chunk 
(state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 34296 1726855345.19248: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34296 1726855345.19260: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34296 1726855345.19311: stdout chunk (state=3): >>># destroy ntpath <<< 34296 1726855345.19344: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 34296 1726855345.19371: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale <<< 34296 1726855345.19394: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 34296 1726855345.19405: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 34296 1726855345.19439: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 34296 1726855345.19464: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34296 1726855345.19504: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors <<< 34296 1726855345.19540: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 34296 1726855345.19577: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 34296 1726855345.19600: stdout chunk 
(state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 34296 1726855345.19633: stdout chunk (state=3): >>># destroy _ssl <<< 34296 1726855345.19653: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 34296 1726855345.19695: stdout chunk (state=3): >>># destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch <<< 34296 1726855345.19715: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 34296 1726855345.19727: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 34296 1726855345.19775: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 34296 1726855345.19802: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 34296 1726855345.19850: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] 
wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref <<< 34296 1726855345.19905: stdout chunk (state=3): >>># cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34296 1726855345.19908: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 34296 1726855345.19939: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 34296 1726855345.20015: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 34296 1726855345.20039: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 34296 1726855345.20061: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 34296 1726855345.20204: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 34296 1726855345.20253: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 34296 1726855345.20306: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34296 1726855345.20622: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34296 1726855345.20933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.44 closed. 
<<< 34296 1726855345.20941: stdout chunk (state=3): >>><<< 34296 1726855345.20943: stderr chunk (state=3): >>><<< 34296 1726855345.21204: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643f8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643fcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dabda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dabfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643de37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643de3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc3a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc1190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e03710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e02330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643dc2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643daa810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e387a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e38c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e38b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e38ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643da6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e395b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e39280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e506e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e51df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e52c60> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e532c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e521b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643e53d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e53470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b5fb90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b88620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88380> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b88650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b88f80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643b898b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b5dd30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b8ac00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b88e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643e3ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bb6f60> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bdb320> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c07f80> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c3a7e0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643c381a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bdbfb0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435291c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643bda120> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643b8bb60> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5643bda240> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z7kx84y4/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f564358ae40> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643569d30> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643568ef0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643588ce0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c2870> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c2600> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c1f10> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c2660> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564358b860> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c35c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56435c3800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56435c3d40> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643425a00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643427680> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5643427f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342cf80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342fce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643bb6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342dfa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5643433c20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643432720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643432480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434329c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564342e4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643477e00> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643477800> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643479a30> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434797f0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564347bfb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347a120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347f5f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564347bf20> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643480740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f56434808c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643480950> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643478170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643483fe0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564330d250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434827e0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643483b90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643482420> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643311460> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643312240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434804a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433122d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643313380> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f564331dd30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643318d70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643406720> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56434fe420> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564331de20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643310f80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b1b20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642f8bbc0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642f8bf50> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b31a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b2690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b01d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b0bc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642fa2de0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa26c0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642fa2870> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa1af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa2f00> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642ff99d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642fa39b0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f56433b1820> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642ffbe90> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642ffa4b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643039be0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f564301d910> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5643041970> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5643041670> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5642dd7500> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642dd42f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642dd6ea0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e1d310> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e1ef90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e684d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5642e6b350> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "02", "second": "24", "epoch": "1726855344", "epoch_int": "1726855344", "date": "2024-09-20", "time": "14:02:24", "iso8601_micro": "2024-09-20T18:02:24.813786Z", "iso8601": "2024-09-20T18:02:24Z", "iso8601_basic": "20240920T140224813786", "iso8601_basic_short": "20240920T140224", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-44", "ansible_nodename": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec256a8385d0e1834efdcee0241eb89f", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDHUYt5wfRkUoXhr+uEExQ+HhovkYI5uoocZphrjWB9VMTYAPkmr8CuX+VnCXEjIcFkU4gKaqEurk7K1oCCCvMVY+8Lx5A0D1tOL7nuQ3qS/VV7QoUpqykECy83nEA8/4uH5k1zBHFre8BVrJ2+ajOt+yFNJ09xVXQYHFaYl2Fb5yrh0VPKS0HbLIX/421Zj6H4fXZh2hVLH3p7ZnvXxgY+fNXOwkbFXFpljGMQuXNjflqfOabJUQ520FAteLnBbrkmG2hLRBKB+xMKVgsWad5mNepxZioFTgRyVzX0ykevriDqvEDKfAw+FLZSyT09gS4HrWGkyEOfKtmH24gWrtXrEEoTaNCRtSBTo2YVliQabO4JiMyDbKv/l842RgOZVW2ZYRhJt9vJlQH9vqiJec+B8CFp2wWmrk86VMrbbGpCUV3kumTaD569IJ8oPsDPdTkLjmfkbkbrD8F6DHXNDMfaKTNhwGrb7az2ZI4oXtuRV5Nj/asaepJFulEPiE8MsSM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLkB2rX0xX4F/DzMEydgf7lYw3g8CdOq7oErXDwnQFxylKGWmIu8Cb6CiKkdKvj7f3AU3Wj6OPQuHXkxsl5hV9I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMNckAa3R2XGvAGYYtN9sXs3nJFV9bRzT0omObiULH2l", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.349609375, "5m": 0.4296875, "15m": 0.28759765625}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 40014 10.31.10.44 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 40014 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:b4:71:11:91:97", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.44", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10b4:71ff:fe11:9197", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.44", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:b4:71:11:91:97", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.44"], "ansible_all_ipv6_addresses": ["fe80::10b4:71ff:fe11:9197"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.44", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10b4:71ff:fe11:9197"]}, "ansible_fips": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec256a83-85d0-e183-4efd-cee0241eb89f", "ansible_product_uuid": "ec256a83-85d0-e183-4efd-cee0241eb89f", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1112, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261790130176, "block_size": 4096, "block_total": 65519099, "block_available": 63913606, "block_used": 1605493, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # 
cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] 
removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # 
cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] 
removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # 
cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy 
ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy 
ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy 
_pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap 
# cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # 
destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.10.44 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
34296 1726855345.23392: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34296 1726855345.23395: _low_level_execute_command(): starting 34296 1726855345.23405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855344.0170276-34311-136721470171755/ > /dev/null 2>&1 && sleep 0' 34296 1726855345.23641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34296 1726855345.23741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855345.23786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855345.23814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855345.23860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.23966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.26304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855345.26309: stdout chunk (state=3): >>><<< 34296 1726855345.26311: stderr chunk (state=3): >>><<< 34296 1726855345.26314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855345.26316: handler run complete 34296 
1726855345.26318: variable 'ansible_facts' from source: unknown 34296 1726855345.26448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.26817: variable 'ansible_facts' from source: unknown 34296 1726855345.26917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.27077: attempt loop complete, returning result 34296 1726855345.27086: _execute() done 34296 1726855345.27096: dumping result to json 34296 1726855345.27128: done dumping result, returning 34296 1726855345.27139: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcc66-ac2b-a97a-1acc-000000000147] 34296 1726855345.27147: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000147 ok: [managed_node1] 34296 1726855345.27998: no more pending results, returning what we have 34296 1726855345.28001: results queue empty 34296 1726855345.28002: checking for any_errors_fatal 34296 1726855345.28117: done checking for any_errors_fatal 34296 1726855345.28118: checking for max_fail_percentage 34296 1726855345.28120: done checking for max_fail_percentage 34296 1726855345.28121: checking to see if all hosts have failed and the running result is not ok 34296 1726855345.28122: done checking to see if all hosts have failed 34296 1726855345.28123: getting the remaining hosts for this loop 34296 1726855345.28124: done getting the remaining hosts for this loop 34296 1726855345.28128: getting the next task for host managed_node1 34296 1726855345.28135: done getting next task for host managed_node1 34296 1726855345.28136: ^ task is: TASK: meta (flush_handlers) 34296 1726855345.28138: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 34296 1726855345.28149: getting variables 34296 1726855345.28150: in VariableManager get_vars() 34296 1726855345.28173: Calling all_inventory to load vars for managed_node1 34296 1726855345.28176: Calling groups_inventory to load vars for managed_node1 34296 1726855345.28179: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855345.28228: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000147 34296 1726855345.28231: WORKER PROCESS EXITING 34296 1726855345.28240: Calling all_plugins_play to load vars for managed_node1 34296 1726855345.28243: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855345.28246: Calling groups_plugins_play to load vars for managed_node1 34296 1726855345.28524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.28734: done with get_vars() 34296 1726855345.28745: done getting variables 34296 1726855345.28821: in VariableManager get_vars() 34296 1726855345.28829: Calling all_inventory to load vars for managed_node1 34296 1726855345.28831: Calling groups_inventory to load vars for managed_node1 34296 1726855345.28833: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855345.28837: Calling all_plugins_play to load vars for managed_node1 34296 1726855345.28840: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855345.28842: Calling groups_plugins_play to load vars for managed_node1 34296 1726855345.28994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.29216: done with get_vars() 34296 1726855345.29235: done queuing things up, now waiting for results queue to drain 34296 1726855345.29237: results queue empty 34296 1726855345.29238: checking for any_errors_fatal 34296 1726855345.29240: done checking for any_errors_fatal 34296 1726855345.29241: checking for 
max_fail_percentage 34296 1726855345.29242: done checking for max_fail_percentage 34296 1726855345.29243: checking to see if all hosts have failed and the running result is not ok 34296 1726855345.29248: done checking to see if all hosts have failed 34296 1726855345.29249: getting the remaining hosts for this loop 34296 1726855345.29250: done getting the remaining hosts for this loop 34296 1726855345.29253: getting the next task for host managed_node1 34296 1726855345.29257: done getting next task for host managed_node1 34296 1726855345.29260: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34296 1726855345.29261: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855345.29263: getting variables 34296 1726855345.29264: in VariableManager get_vars() 34296 1726855345.29272: Calling all_inventory to load vars for managed_node1 34296 1726855345.29274: Calling groups_inventory to load vars for managed_node1 34296 1726855345.29276: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855345.29280: Calling all_plugins_play to load vars for managed_node1 34296 1726855345.29282: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855345.29285: Calling groups_plugins_play to load vars for managed_node1 34296 1726855345.29441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.29640: done with get_vars() 34296 1726855345.29648: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 14:02:25 -0400 (0:00:01.325) 0:00:01.334 
****** 34296 1726855345.29727: entering _queue_task() for managed_node1/include_tasks 34296 1726855345.29729: Creating lock for include_tasks 34296 1726855345.30179: worker is 1 (out of 1 available) 34296 1726855345.30191: exiting _queue_task() for managed_node1/include_tasks 34296 1726855345.30203: done queuing things up, now waiting for results queue to drain 34296 1726855345.30205: waiting for pending results... 34296 1726855345.30345: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 34296 1726855345.30450: in run() - task 0affcc66-ac2b-a97a-1acc-000000000006 34296 1726855345.30472: variable 'ansible_search_path' from source: unknown 34296 1726855345.30515: calling self._execute() 34296 1726855345.30592: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855345.30609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855345.30624: variable 'omit' from source: magic vars 34296 1726855345.30743: _execute() done 34296 1726855345.30755: dumping result to json 34296 1726855345.30826: done dumping result, returning 34296 1726855345.30830: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affcc66-ac2b-a97a-1acc-000000000006] 34296 1726855345.30832: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000006 34296 1726855345.30908: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000006 34296 1726855345.30911: WORKER PROCESS EXITING 34296 1726855345.30959: no more pending results, returning what we have 34296 1726855345.30968: in VariableManager get_vars() 34296 1726855345.31002: Calling all_inventory to load vars for managed_node1 34296 1726855345.31005: Calling groups_inventory to load vars for managed_node1 34296 1726855345.31009: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855345.31020: Calling all_plugins_play to load vars for managed_node1 34296 1726855345.31023: Calling 
groups_plugins_inventory to load vars for managed_node1 34296 1726855345.31026: Calling groups_plugins_play to load vars for managed_node1 34296 1726855345.31475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.31675: done with get_vars() 34296 1726855345.31682: variable 'ansible_search_path' from source: unknown 34296 1726855345.31772: we have included files to process 34296 1726855345.31773: generating all_blocks data 34296 1726855345.31775: done generating all_blocks data 34296 1726855345.31776: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34296 1726855345.31777: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34296 1726855345.31779: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34296 1726855345.32667: in VariableManager get_vars() 34296 1726855345.32683: done with get_vars() 34296 1726855345.32699: done processing included file 34296 1726855345.32702: iterating over new_blocks loaded from include file 34296 1726855345.32704: in VariableManager get_vars() 34296 1726855345.32713: done with get_vars() 34296 1726855345.32715: filtering new block on tags 34296 1726855345.32740: done filtering new block on tags 34296 1726855345.32744: in VariableManager get_vars() 34296 1726855345.32754: done with get_vars() 34296 1726855345.32756: filtering new block on tags 34296 1726855345.32772: done filtering new block on tags 34296 1726855345.32774: in VariableManager get_vars() 34296 1726855345.32785: done with get_vars() 34296 1726855345.32786: filtering new block on tags 34296 1726855345.32803: done filtering new block on tags 34296 1726855345.32805: done iterating over new_blocks loaded from include file included: 
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 34296 1726855345.32810: extending task lists for all hosts with included blocks 34296 1726855345.32865: done extending task lists 34296 1726855345.32866: done processing included files 34296 1726855345.32867: results queue empty 34296 1726855345.32867: checking for any_errors_fatal 34296 1726855345.32869: done checking for any_errors_fatal 34296 1726855345.32869: checking for max_fail_percentage 34296 1726855345.32870: done checking for max_fail_percentage 34296 1726855345.32871: checking to see if all hosts have failed and the running result is not ok 34296 1726855345.32872: done checking to see if all hosts have failed 34296 1726855345.32872: getting the remaining hosts for this loop 34296 1726855345.32873: done getting the remaining hosts for this loop 34296 1726855345.32876: getting the next task for host managed_node1 34296 1726855345.32879: done getting next task for host managed_node1 34296 1726855345.32881: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34296 1726855345.32884: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855345.32886: getting variables 34296 1726855345.32890: in VariableManager get_vars() 34296 1726855345.32899: Calling all_inventory to load vars for managed_node1 34296 1726855345.32901: Calling groups_inventory to load vars for managed_node1 34296 1726855345.32903: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855345.32909: Calling all_plugins_play to load vars for managed_node1 34296 1726855345.32911: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855345.32914: Calling groups_plugins_play to load vars for managed_node1 34296 1726855345.33092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855345.33295: done with get_vars() 34296 1726855345.33304: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 14:02:25 -0400 (0:00:00.036) 0:00:01.371 ****** 34296 1726855345.33365: entering _queue_task() for managed_node1/setup 34296 1726855345.33735: worker is 1 (out of 1 available) 34296 1726855345.33746: exiting _queue_task() for managed_node1/setup 34296 1726855345.33756: done queuing things up, now waiting for results queue to drain 34296 1726855345.33757: waiting for pending results... 
34296 1726855345.34103: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 34296 1726855345.34109: in run() - task 0affcc66-ac2b-a97a-1acc-000000000158 34296 1726855345.34111: variable 'ansible_search_path' from source: unknown 34296 1726855345.34114: variable 'ansible_search_path' from source: unknown 34296 1726855345.34162: calling self._execute() 34296 1726855345.34335: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855345.34367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855345.34473: variable 'omit' from source: magic vars 34296 1726855345.35395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34296 1726855345.39662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34296 1726855345.39944: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34296 1726855345.39956: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34296 1726855345.39999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34296 1726855345.40078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34296 1726855345.40243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34296 1726855345.40400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34296 1726855345.40433: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34296 1726855345.40478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34296 1726855345.40566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34296 1726855345.40921: variable 'ansible_facts' from source: unknown 34296 1726855345.40992: variable 'network_test_required_facts' from source: task vars 34296 1726855345.41041: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34296 1726855345.41052: variable 'omit' from source: magic vars 34296 1726855345.41101: variable 'omit' from source: magic vars 34296 1726855345.41148: variable 'omit' from source: magic vars 34296 1726855345.41182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34296 1726855345.41250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34296 1726855345.41256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34296 1726855345.41266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855345.41284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855345.41321: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34296 1726855345.41329: variable 'ansible_host' from source: host vars for 
'managed_node1' 34296 1726855345.41355: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855345.41445: Set connection var ansible_shell_type to sh 34296 1726855345.41463: Set connection var ansible_shell_executable to /bin/sh 34296 1726855345.41493: Set connection var ansible_connection to ssh 34296 1726855345.41496: Set connection var ansible_timeout to 10 34296 1726855345.41498: Set connection var ansible_module_compression to ZIP_DEFLATED 34296 1726855345.41508: Set connection var ansible_pipelining to False 34296 1726855345.41539: variable 'ansible_shell_executable' from source: unknown 34296 1726855345.41573: variable 'ansible_connection' from source: unknown 34296 1726855345.41580: variable 'ansible_module_compression' from source: unknown 34296 1726855345.41582: variable 'ansible_shell_type' from source: unknown 34296 1726855345.41584: variable 'ansible_shell_executable' from source: unknown 34296 1726855345.41589: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855345.41591: variable 'ansible_pipelining' from source: unknown 34296 1726855345.41593: variable 'ansible_timeout' from source: unknown 34296 1726855345.41595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855345.41749: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34296 1726855345.41764: variable 'omit' from source: magic vars 34296 1726855345.41773: starting attempt loop 34296 1726855345.41779: running the handler 34296 1726855345.41806: _low_level_execute_command(): starting 34296 1726855345.42098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34296 1726855345.42707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34296 
1726855345.42724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855345.42742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855345.42767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34296 1726855345.42879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855345.42894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855345.42910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.43092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.44718: stdout chunk (state=3): >>>/root <<< 34296 1726855345.44885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855345.44891: stdout chunk (state=3): >>><<< 34296 1726855345.44893: stderr chunk (state=3): >>><<< 34296 1726855345.44915: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855345.44948: _low_level_execute_command(): starting 34296 1726855345.44993: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497 `" && echo ansible-tmp-1726855345.4493616-34361-64298679482497="` echo /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497 `" ) && sleep 0' 34296 1726855345.46636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855345.46718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855345.46940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855345.46970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.47142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.49109: stdout chunk (state=3): >>>ansible-tmp-1726855345.4493616-34361-64298679482497=/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497 <<< 34296 1726855345.49214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855345.49251: stderr chunk (state=3): >>><<< 34296 1726855345.49254: stdout chunk (state=3): >>><<< 34296 1726855345.49271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855345.4493616-34361-64298679482497=/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855345.49551: variable 'ansible_module_compression' from source: unknown 34296 1726855345.49554: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-342968v_83h8s/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34296 1726855345.49675: variable 'ansible_facts' from source: unknown 34296 1726855345.50141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py 34296 1726855345.50386: Sending initial data 34296 1726855345.50428: Sent initial data (153 bytes) 34296 1726855345.51771: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855345.51924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855345.51999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855345.52015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.52119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.53826: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34296 1726855345.53879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-342968v_83h8s/tmpeplr9bkh /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py <<< 34296 1726855345.53882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py" <<< 34296 1726855345.53943: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-342968v_83h8s/tmpeplr9bkh" to remote "/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py" <<< 34296 1726855345.56813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855345.56932: stderr chunk (state=3): >>><<< 34296 1726855345.56935: stdout chunk (state=3): >>><<< 34296 1726855345.56937: done transferring module to remote 34296 1726855345.56939: _low_level_execute_command(): starting 34296 1726855345.56941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/ /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py && sleep 0' 34296 1726855345.58052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855345.58207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855345.58221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855345.58373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.58459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.60474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855345.60528: stderr chunk (state=3): >>><<< 34296 1726855345.60531: stdout chunk (state=3): >>><<< 34296 1726855345.60549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34296 1726855345.60552: _low_level_execute_command(): starting 34296 1726855345.60558: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/AnsiballZ_setup.py && sleep 0' 34296 1726855345.61224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34296 1726855345.61231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855345.61242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855345.61256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34296 1726855345.61281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 <<< 34296 1726855345.61289: stderr chunk (state=3): >>>debug2: match not found <<< 34296 1726855345.61390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK <<< 34296 
1726855345.61408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855345.61518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34296 1726855345.64307: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34296 1726855345.64423: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # <<< 34296 1726855345.64628: stdout chunk (state=3): >>>import '_weakref' # import '_io' # import 'marshal' # import 'posix' # <<< 34296 1726855345.64631: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 34296 1726855345.64633: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.64664: stdout chunk (state=3): >>>import '_codecs' # <<< 34296 1726855345.64685: stdout chunk (state=3): >>>import 'codecs' # <<< 34296 1726855345.64875: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af918bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91bea50> import '_signal' # import '_abc' # import 'abc' # <<< 34296 1726855345.64892: stdout chunk 
(state=3): >>>import 'io' # <<< 34296 1726855345.64924: stdout chunk (state=3): >>>import '_stat' # <<< 34296 1726855345.64939: stdout chunk (state=3): >>>import 'stat' # <<< 34296 1726855345.65111: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 34296 1726855345.65145: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 34296 1726855345.65171: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34296 1726855345.65260: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91cd130> <<< 34296 1726855345.65313: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34296 1726855345.65330: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.65358: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91cdfa0> <<< 34296 1726855345.65478: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34296 1726855345.66068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34296 1726855345.66309: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8febe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8febef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34296 1726855345.66504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9023860> # 
/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9023ef0> import '_collections' # <<< 34296 1726855345.66558: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9003b30> <<< 34296 1726855345.66561: stdout chunk (state=3): >>>import '_functools' # <<< 34296 1726855345.66597: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9001220> <<< 34296 1726855345.66686: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe9010> <<< 34296 1726855345.66710: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34296 1726855345.66724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34296 1726855345.66806: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 34296 1726855345.66818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34296 1726855345.66846: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90437a0> <<< 34296 1726855345.66864: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90423c0> <<< 34296 1726855345.66892: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 34296 1726855345.67105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90020f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fea8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90787d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9078c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9078b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9078f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 34296 1726855345.67198: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90795e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90792b0> import 'importlib.machinery' # <<< 34296 1726855345.67221: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 34296 1726855345.67225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34296 1726855345.67254: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907a4b0> <<< 34296 1726855345.67257: stdout chunk (state=3): >>>import 'importlib.util' # <<< 34296 1726855345.67260: stdout chunk (state=3): >>>import 'runpy' # <<< 34296 1726855345.67285: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34296 1726855345.67314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34296 1726855345.67340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90906b0> <<< 34296 1726855345.67518: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' 
import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9091d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9092c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9093260> <<< 34296 1726855345.67522: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9092150> <<< 34296 1726855345.67536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34296 1726855345.67552: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34296 1726855345.67583: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.67633: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9093ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9093410> <<< 34296 1726855345.67645: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907a420> <<< 34296 1726855345.67670: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34296 1726855345.67805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8d93c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34296 1726855345.67862: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbc6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbc410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbc6e0> <<< 34296 1726855345.67884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34296 1726855345.68103: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.68232: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbd010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.68236: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbda00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8d91df0> <<< 34296 1726855345.68263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34296 1726855345.68281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34296 1726855345.68315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34296 1726855345.68398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34296 1726855345.68410: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907abd0> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34296 1726855345.68457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.68469: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34296 1726855345.68510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34296 1726855345.68531: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8de71a0> <<< 34296 1726855345.68592: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34296 1726855345.68608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.68705: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e0b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34296 1726855345.68753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34296 1726855345.68888: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6c230> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34296 1726855345.69003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34296 1726855345.69023: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6e990> <<< 34296 1726855345.69204: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e31250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8725370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e0a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbfd70> <<< 34296 1726855345.69361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34296 1726855345.69375: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8af8725610> <<< 34296 1726855345.69652: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ajq2gvf7/ansible_setup_payload.zip' # zipimport: zlib available <<< 34296 1726855345.69780: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.69859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches 
/usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34296 1726855345.69930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34296 1726855345.70086: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 34296 1726855345.70091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af878f0b0> import '_typing' # <<< 34296 1726855345.70155: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af876dfa0> <<< 34296 1726855345.70172: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af876d130> # zipimport: zlib available <<< 34296 1726855345.70206: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34296 1726855345.70309: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 34296 1726855345.70312: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.71660: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.72823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af878ce00> <<< 34296 1726855345.73028: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af87bea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be0f0> <<< 34296 1726855345.73032: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34296 1726855345.73035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34296 1726855345.73076: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be540> <<< 34296 1726855345.73079: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af878fb30> import 'atexit' # <<< 34296 1726855345.73106: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f8af87bf7a0> <<< 34296 1726855345.73129: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.73143: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af87bf9e0> <<< 34296 1726855345.73303: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87bfef0> import 'pwd' # <<< 34296 1726855345.73517: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8629d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af862b920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862c2f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34296 1726855345.73521: stdout chunk (state=3): >>>import 'shlex' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8af862d490> <<< 34296 1726855345.73542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34296 1726855345.73574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34296 1726855345.73596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34296 1726855345.73806: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862ff20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8fe6ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862e1e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34296 1726855345.73925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34296 1726855345.73944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 34296 1726855345.73965: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8637e60> import '_tokenize' # <<< 34296 1726855345.74037: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8636930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af86366c0> <<< 34296 1726855345.74055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34296 1726855345.74067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34296 1726855345.74301: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8636c00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862e6f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af867bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches 
/usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34296 1726855345.74327: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.74340: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af867dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867db50> <<< 34296 1726855345.74357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34296 1726855345.74386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34296 1726855345.74440: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8680350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867e480> <<< 34296 1726855345.74461: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34296 1726855345.74513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.74531: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34296 1726855345.74703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' 
import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8683a40> <<< 34296 1726855345.74720: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8680410> <<< 34296 1726855345.74783: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684830> <<< 34296 1726855345.75004: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # 
extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af850c470> <<< 34296 1726855345.75145: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.75169: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af850d550> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8686c30> <<< 34296 1726855345.75503: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8687f80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34296 1726855345.75613: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.75729: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.76284: stdout chunk (state=3): >>># zipimport: zlib available <<< 
34296 1726855345.76819: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34296 1726855345.76835: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 34296 1726855345.76855: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34296 1726855345.76875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.76929: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8515790> <<< 34296 1726855345.77024: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 34296 1726855345.77052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8516540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af850d730> <<< 34296 1726855345.77159: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 34296 1726855345.77169: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.77301: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.77462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches 
/usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8516300> <<< 34296 1726855345.77596: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.77921: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78359: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78463: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78505: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34296 1726855345.78516: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78600: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 34296 1726855345.78669: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78741: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34296 1726855345.78800: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34296 1726855345.78825: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.78871: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34296 1726855345.79146: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34296 1726855345.79613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85176b0> # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.79643: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34296 1726855345.79662: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34296 1726855345.79673: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79714: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79759: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34296 1726855345.79772: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79889: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79892: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79907: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.79981: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34296 1726855345.80020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855345.80113: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.80202: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8522270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af851d940> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 34296 1726855345.80258: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34296 1726855345.80316: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.80339: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.80430: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34296 1726855345.80439: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34296 1726855345.80774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34296 1726855345.80777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34296 1726855345.80779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34296 1726855345.80783: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af860abd0> <<< 34296 1726855345.80785: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87ea8a0> <<< 34296 1726855345.80792: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85223c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87255e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34296 1726855345.80795: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34296 1726855345.80809: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34296 1726855345.80862: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34296 1726855345.80884: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.80968: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.81028: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81043: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81070: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81109: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81147: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81188: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81219: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34296 1726855345.81236: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81312: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81377: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81509: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81529: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34296 1726855345.81624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81794: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81837: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.81890: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/__init__.py <<< 34296 1726855345.81909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 34296 1726855345.81975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34296 1726855345.82021: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b24e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34296 1726855345.82075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 34296 1726855345.82098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34296 1726855345.82193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c0290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c0560> <<< 34296 1726855345.82240: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af859c080> <<< 34296 1726855345.82257: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b3020> <<< 34296 1726855345.82290: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b0bc0> <<< 34296 1726855345.82520: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b0650> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c35c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c2e70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c3050> 
import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c22d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34296 1726855345.82646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34296 1726855345.82662: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c3710> <<< 34296 1726855345.82679: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34296 1726855345.82703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34296 1726855345.82738: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af82221b0> <<< 34296 1726855345.82770: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c0c50> <<< 34296 1726855345.82791: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b08f0> <<< 34296 1726855345.82829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 34296 1726855345.82832: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.82900: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34296 
1726855345.82941: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.82944: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.82976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34296 1726855345.82996: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83049: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83158: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34296 1726855345.83162: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.83164: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 34296 1726855345.83169: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83171: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34296 1726855345.83211: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83257: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34296 1726855345.83317: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83354: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34296 1726855345.83408: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83592: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83595: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83598: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.83657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 
'ansible.module_utils.facts.system.cmdline' # <<< 34296 1726855345.83708: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84138: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34296 1726855345.84591: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84638: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84694: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84829: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 34296 1726855345.84836: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 34296 1726855345.84892: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34296 1726855345.84966: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.84993: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85021: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 34296 1726855345.85095: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85099: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 34296 1726855345.85189: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from 
'/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34296 1726855345.85321: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8222420> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 34296 1726855345.85462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34296 1726855345.85488: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8223080> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 34296 1726855345.85552: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34296 1726855345.85622: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85771: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34296 1726855345.85867: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.85882: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.86053: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34296 1726855345.86094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34296 1726855345.86166: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.86227: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855345.86350: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af825e510> <<< 34296 1726855345.86423: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af824c890> <<< 34296 1726855345.86430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 34296 1726855345.86516: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.86547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34296 1726855345.86603: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.86641: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.86770: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.86832: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.87190: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.87249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 34296 1726855345.87315: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af82722a0> import 'getpass' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8af8272300> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 34296 1726855345.87357: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 34296 1726855345.87413: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.87501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34296 1726855345.87757: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34296 1726855345.88006: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88193: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88289: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88353: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available<<< 34296 1726855345.88505: stdout chunk (state=3): >>> <<< 34296 1726855345.88525: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88573: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88761: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.88979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34296 1726855345.89055: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.89175: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.89356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34296 1726855345.89446: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.89468: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.90443: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.90792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 34296 1726855345.90815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 34296 1726855345.90895: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.91014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 34296 1726855345.91122: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.91223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 34296 1726855345.91372: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.91718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.91743: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 34296 1726855345.91999: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.92034: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.92368: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.92692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34296 1726855345.92730: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 34296 1726855345.92770: stdout chunk (state=3): >>># zipimport: zlib available <<< 
34296 1726855345.92813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 34296 1726855345.92885: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.92909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34296 1726855345.93015: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.93128: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34296 1726855345.93171: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.93174: stdout chunk (state=3): >>> <<< 34296 1726855345.93212: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.93247: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 34296 1726855345.93286: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.93379: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.93508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 34296 1726855345.93686: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 34296 1726855345.93710: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34296 1726855345.94157: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.94165: stdout chunk (state=3): >>> <<< 34296 1726855345.94586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34296 1726855345.94602: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34296 1726855345.94671: stdout chunk (state=3): >>> <<< 34296 1726855345.94717: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.94725: stdout chunk (state=3): >>> <<< 34296 1726855345.94817: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.iscsi' # <<< 34296 1726855345.94819: stdout chunk (state=3): >>> <<< 34296 1726855345.94861: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.94901: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.94971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 34296 1726855345.95069: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 34296 1726855345.95096: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.95189: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 34296 1726855345.95195: stdout chunk (state=3): >>> <<< 34296 1726855345.95219: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.95346: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.95468: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.sunos' # <<< 34296 1726855345.95473: stdout chunk (state=3): >>> <<< 34296 1726855345.95514: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.95538: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.95545: stdout chunk (state=3): >>> <<< 34296 1726855345.95557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 34296 1726855345.95563: stdout chunk (state=3): >>> <<< 34296 1726855345.95585: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.95596: stdout chunk (state=3): >>> <<< 34296 1726855345.95656: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.95663: stdout chunk (state=3): >>> <<< 34296 1726855345.95723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 34296 1726855345.95753: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34296 
1726855345.95756: stdout chunk (state=3): >>> <<< 34296 1726855345.95821: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.95899: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.95990: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34296 1726855345.96004: stdout chunk (state=3): >>> <<< 34296 1726855345.96111: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.96226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 34296 1726855345.96248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 34296 1726855345.96286: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 34296 1726855345.96428: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 34296 1726855345.96455: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.96776: stdout chunk (state=3): >>># zipimport: zlib available<<< 34296 1726855345.96790: stdout chunk (state=3): >>> <<< 34296 1726855345.97165: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855345.97229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34296 1726855345.97241: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.97304: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.97382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34296 1726855345.97384: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.97504: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.97645: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 
34296 1726855345.97666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 34296 1726855345.97751: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.97842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 34296 1726855345.97868: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 34296 1726855345.97920: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855345.99006: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34296 1726855345.99019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34296 1726855345.99064: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8073440> <<< 34296 1726855345.99099: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8071f10> <<< 34296 1726855345.99157: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8069010> <<< 34296 1726855345.99743: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDHUYt5wfRkUoXhr+uEExQ+HhovkYI5uoocZphrjWB9VMTYAPkmr8CuX+VnCXEjIcFkU4gKaqEurk7K1oCCCvMVY+8Lx5A0D1tOL7nuQ3qS/VV7QoUpqykECy83nEA8/4uH5k1zBHFre8BVrJ2+ajOt+yFNJ09xVXQYHFaYl2Fb5yrh0VPKS0HbLIX/421Zj6H4fXZh2hVLH3p7ZnvXxgY+fNXOwkbFXFpljGMQuXNjflqfOabJUQ520FAteLnBbrkmG2hLRBKB+xMKVgsWad5mNepxZioFTgRyVzX0ykevriDqvEDKfAw+FLZSyT09gS4HrWGkyEOfKtmH24gWrtXrEEoTaNCRtSBTo2YVliQabO4JiMyDbKv/l842RgOZVW2ZYRhJt9vJlQH9vqiJec+B8CFp2wWmrk86VMrbbGpCUV3kumTaD569IJ8oPsDPdTkLjmfkbkbrD8F6DHXNDMfaKTNhwGrb7az2ZI4oXtuRV5Nj/asaepJFulEPiE8MsSM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLkB2rX0xX4F/DzMEydgf7lYw3g8CdOq7oErXDwnQFxylKGWmIu8Cb6CiKkdKvj7f3AU3Wj6OPQuHXkxsl5hV9I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMNckAa3R2XGvAGYYtN9sXs3nJFV9bRzT0omObiULH2l", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 40014 10.31.10.44 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 40014 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-44", "ansible_nodename": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "a<<< 34296 1726855345.99757: stdout chunk (state=3): 
>>>nsible_machine_id": "ec256a8385d0e1834efdcee0241eb89f", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "02", "second": "25", "epoch": "1726855345", "epoch_int": "1726855345", "date": "2024-09-20", "time": "14:02:25", "iso8601_micro": "2024-09-20T18:02:25.995688Z", "iso8601": "2024-09-20T18:02:25Z", "iso8601_basic": "20240920T140225995688", "iso8601_basic_short": "20240920T140225", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34296 1726855346.00745: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing 
encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] 
removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 34296 1726855346.00783: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] 
removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing<<< 34296 1726855346.00837: stdout chunk (state=3): >>> ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # 
cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2<<< 34296 1726855346.00861: stdout chunk (state=3): >>>] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansibl<<< 34296 1726855346.00866: stdout chunk (state=3): >>>e.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34296 1726855346.01244: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34296 1726855346.01276: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34296 1726855346.01307: stdout chunk (state=3): >>># destroy _bz2 <<< 34296 1726855346.01311: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 <<< 34296 1726855346.01320: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34296 1726855346.01369: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34296 1726855346.01429: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 34296 1726855346.01443: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 34296 1726855346.01478: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 34296 1726855346.01481: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 34296 1726855346.01527: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34296 1726855346.01633: stdout 
chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq <<< 34296 1726855346.01650: stdout chunk (state=3): >>># destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 34296 1726855346.02168: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 34296 1726855346.02178: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # 
cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34296 1726855346.02295: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 34296 1726855346.02336: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy 
stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34296 1726855346.02347: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34296 1726855346.02447: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 34296 1726855346.02492: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34296 1726855346.02511: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 34296 1726855346.02531: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator <<< 34296 1726855346.02540: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools <<< 34296 1726855346.02604: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34296 1726855346.03370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.44 closed. 
<<< 34296 1726855346.03374: stdout chunk (state=3): >>><<< 34296 1726855346.03376: stderr chunk (state=3): >>><<< 34296 1726855346.03610: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af918bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af91cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8febe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8febef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9023860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9023ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9003b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9001220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe9010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90437a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90423c0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90020f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fea8d0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90787d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9078c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9078b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9078f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8fe6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90795e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90792b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af90906b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9091d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9092c00> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9093260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9092150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af9093ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af9093410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907a420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8d93c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbc6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbc410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbc6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbd010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8dbda00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8d91df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af907abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8de71a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e0b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e6c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e31250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8725370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8e0a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8dbfd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8af8725610> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ajq2gvf7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8af878f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af876dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af876d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af878ce00> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af87bea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87be540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af878fb30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af87bf7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af87bf9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87bfef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8629d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af862b920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8af862c2f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862d490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862ff20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8fe6ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862e1e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8637e60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8636930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af86366c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8636c00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af862e6f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af867bf80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af867dd90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867db50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8680350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867e480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8683a40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8680410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684c50> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8684b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af867c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af850c470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af850d550> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8686c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8687f80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8686840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8515790> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8516540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af850d730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8516300> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85176b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8522270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af851d940> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af860abd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87ea8a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85223c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af87255e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b24e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c0290> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c0560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af859c080> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b3020> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b0bc0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b0650> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c35c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c2e70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af81c3050> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c22d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c3710> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af82221b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af81c0c50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af85b08f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8222420> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8223080> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af825e510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af824c890> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af82722a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8272300> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8af8073440> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8071f10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8af8069010> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDHUYt5wfRkUoXhr+uEExQ+HhovkYI5uoocZphrjWB9VMTYAPkmr8CuX+VnCXEjIcFkU4gKaqEurk7K1oCCCvMVY+8Lx5A0D1tOL7nuQ3qS/VV7QoUpqykECy83nEA8/4uH5k1zBHFre8BVrJ2+ajOt+yFNJ09xVXQYHFaYl2Fb5yrh0VPKS0HbLIX/421Zj6H4fXZh2hVLH3p7ZnvXxgY+fNXOwkbFXFpljGMQuXNjflqfOabJUQ520FAteLnBbrkmG2hLRBKB+xMKVgsWad5mNepxZioFTgRyVzX0ykevriDqvEDKfAw+FLZSyT09gS4HrWGkyEOfKtmH24gWrtXrEEoTaNCRtSBTo2YVliQabO4JiMyDbKv/l842RgOZVW2ZYRhJt9vJlQH9vqiJec+B8CFp2wWmrk86VMrbbGpCUV3kumTaD569IJ8oPsDPdTkLjmfkbkbrD8F6DHXNDMfaKTNhwGrb7az2ZI4oXtuRV5Nj/asaepJFulEPiE8MsSM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLkB2rX0xX4F/DzMEydgf7lYw3g8CdOq7oErXDwnQFxylKGWmIu8Cb6CiKkdKvj7f3AU3Wj6OPQuHXkxsl5hV9I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMNckAa3R2XGvAGYYtN9sXs3nJFV9bRzT0omObiULH2l", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.15.200 40014 10.31.10.44 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.15.200 40014 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-44", "ansible_nodename": "ip-10-31-10-44.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec256a8385d0e1834efdcee0241eb89f", "ansible_apparmor": {"status": "disabled"}, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "14", "minute": "02", "second": "25", "epoch": "1726855345", "epoch_int": "1726855345", "date": "2024-09-20", "time": "14:02:25", "iso8601_micro": "2024-09-20T18:02:25.995688Z", "iso8601": "2024-09-20T18:02:25Z", "iso8601_basic": "20240920T140225995688", "iso8601_basic_short": "20240920T140225", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] 
removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear 
sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.44 closed. 
[WARNING]: Module invocation had junk after the JSON data:
34296 1726855346.04531: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
34296 1726855346.04534: _low_level_execute_command(): starting
34296 1726855346.04536: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855345.4493616-34361-64298679482497/ > /dev/null 2>&1 && sleep 0'
34296 1726855346.04668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
34296 1726855346.04672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34296 1726855346.04674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
34296 1726855346.04676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34296 1726855346.04735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<<
34296 1726855346.04737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
34296 1726855346.04739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34296 1726855346.04824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
34296 1726855346.07158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34296 1726855346.07182: stderr chunk (state=3): >>><<<
34296 1726855346.07185: stdout chunk (state=3): >>><<<
34296 1726855346.07203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
34296 1726855346.07209: handler run complete
34296 1726855346.07243: variable 'ansible_facts' from source: unknown
34296 1726855346.07282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.07355: variable 'ansible_facts' from source: unknown
34296 1726855346.07389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.07422: attempt loop complete, returning result
34296 1726855346.07425: _execute() done
34296 1726855346.07429: dumping result to json
34296 1726855346.07440: done dumping result, returning
34296 1726855346.07446: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcc66-ac2b-a97a-1acc-000000000158]
34296 1726855346.07449: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000158
34296 1726855346.07592: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000158
ok: [managed_node1]
34296 1726855346.07690: no more pending results, returning what we have
34296 1726855346.07694: results queue empty
34296 1726855346.07694: checking for any_errors_fatal
34296 1726855346.07697: done checking for any_errors_fatal
34296 1726855346.07697: checking for max_fail_percentage
34296 1726855346.07699: done checking for max_fail_percentage
34296 1726855346.07700: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.07700: done checking to see if all hosts have failed
34296 1726855346.07701: getting the remaining hosts for this loop
34296 1726855346.07702: done getting the remaining hosts for this loop
34296 1726855346.07706: getting the next task for host managed_node1
34296 1726855346.07713: done getting next task for host managed_node1
34296 1726855346.07716: ^ task is: TASK: Check if system is ostree
34296 1726855346.07718: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.07722: getting variables
34296 1726855346.07723: in VariableManager get_vars()
34296 1726855346.07748: Calling all_inventory to load vars for managed_node1
34296 1726855346.07750: Calling groups_inventory to load vars for managed_node1
34296 1726855346.07752: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.07761: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.07763: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.07768: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.07921: WORKER PROCESS EXITING
34296 1726855346.07931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.08046: done with get_vars()
34296 1726855346.08054: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024  14:02:26 -0400 (0:00:00.747)       0:00:02.118 ******
34296 1726855346.08123: entering _queue_task() for managed_node1/stat
34296 1726855346.08343: worker is 1 (out of 1 available)
34296 1726855346.08358: exiting _queue_task() for managed_node1/stat
34296 1726855346.08369: done queuing things up, now waiting for results queue to drain
34296 1726855346.08370: waiting for pending results...
34296 1726855346.08519: running TaskExecutor() for managed_node1/TASK: Check if system is ostree
34296 1726855346.08589: in run() - task 0affcc66-ac2b-a97a-1acc-00000000015a
34296 1726855346.08600: variable 'ansible_search_path' from source: unknown
34296 1726855346.08604: variable 'ansible_search_path' from source: unknown
34296 1726855346.08632: calling self._execute()
34296 1726855346.08690: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.08694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.08706: variable 'omit' from source: magic vars
34296 1726855346.09050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
34296 1726855346.09230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
34296 1726855346.09262: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
34296 1726855346.09292: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
34296 1726855346.09318: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
34296 1726855346.09410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
34296 1726855346.09428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
34296 1726855346.09445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
34296 1726855346.09470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
34296 1726855346.09560: Evaluated conditional (not __network_is_ostree is defined): True
34296 1726855346.09563: variable 'omit' from source: magic vars
34296 1726855346.09595: variable 'omit' from source: magic vars
34296 1726855346.09622: variable 'omit' from source: magic vars
34296 1726855346.09641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
34296 1726855346.09661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
34296 1726855346.09679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
34296 1726855346.09696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34296 1726855346.09705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
34296 1726855346.09727: variable 'inventory_hostname' from source: host vars for 'managed_node1'
34296 1726855346.09730: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.09733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.09798: Set connection var ansible_shell_type to sh
34296 1726855346.09805: Set connection var ansible_shell_executable to /bin/sh
34296 1726855346.09809: Set connection var ansible_connection to ssh
34296 1726855346.09892: Set connection var ansible_timeout to 10
34296 1726855346.09895: Set connection var ansible_module_compression to ZIP_DEFLATED
34296 1726855346.09897: Set connection var ansible_pipelining to False
34296 1726855346.09900: variable 'ansible_shell_executable' from source: unknown
34296 1726855346.09902: variable 'ansible_connection' from
source: unknown 34296 1726855346.09905: variable 'ansible_module_compression' from source: unknown 34296 1726855346.09907: variable 'ansible_shell_type' from source: unknown 34296 1726855346.09909: variable 'ansible_shell_executable' from source: unknown 34296 1726855346.09911: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.09913: variable 'ansible_pipelining' from source: unknown 34296 1726855346.09915: variable 'ansible_timeout' from source: unknown 34296 1726855346.09917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.09965: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34296 1726855346.09975: variable 'omit' from source: magic vars 34296 1726855346.09979: starting attempt loop 34296 1726855346.09982: running the handler 34296 1726855346.09995: _low_level_execute_command(): starting 34296 1726855346.10002: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34296 1726855346.10518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855346.10522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 34296 1726855346.10525: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found <<< 34296 1726855346.10527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.10580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.10582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855346.10585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.10658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.13082: stdout chunk (state=3): >>>/root <<< 34296 1726855346.13219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855346.13250: stderr chunk (state=3): >>><<< 34296 1726855346.13254: stdout chunk (state=3): >>><<< 34296 1726855346.13276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34296 1726855346.13294: _low_level_execute_command(): starting 34296 1726855346.13299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336 `" && echo ansible-tmp-1726855346.1327639-34395-55604447234336="` echo /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336 `" ) && sleep 0' 34296 1726855346.13774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.13777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found <<< 34296 1726855346.13780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.13783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855346.13796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.13849: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.13853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855346.13856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.13923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.16758: stdout chunk (state=3): >>>ansible-tmp-1726855346.1327639-34395-55604447234336=/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336 <<< 34296 1726855346.16912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855346.16940: stderr chunk (state=3): >>><<< 34296 1726855346.16943: stdout chunk (state=3): >>><<< 34296 1726855346.16958: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726855346.1327639-34395-55604447234336=/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34296 1726855346.17011: variable 'ansible_module_compression' from source: unknown 34296 1726855346.17055: ANSIBALLZ: Using lock for stat 34296 1726855346.17059: ANSIBALLZ: Acquiring lock 34296 1726855346.17061: ANSIBALLZ: Lock acquired: 139782747429904 34296 1726855346.17064: ANSIBALLZ: Creating module 34296 1726855346.24825: ANSIBALLZ: Writing module into payload 34296 1726855346.24893: ANSIBALLZ: Writing module 34296 1726855346.24912: ANSIBALLZ: Renaming module 34296 1726855346.24917: ANSIBALLZ: Done creating module 34296 1726855346.24934: variable 'ansible_facts' from source: unknown 34296 1726855346.24989: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py 34296 1726855346.25096: Sending initial data 34296 1726855346.25099: Sent initial data (152 bytes) 34296 1726855346.25568: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855346.25572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.25576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.25578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 
10.31.10.44 debug2: match found <<< 34296 1726855346.25580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.25632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.25635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855346.25661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.25723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.28028: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34296 1726855346.28089: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34296 1726855346.28143: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-342968v_83h8s/tmp2x11xjba /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py <<< 34296 1726855346.28147: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py" <<< 34296 1726855346.28197: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-342968v_83h8s/tmp2x11xjba" to remote "/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py" <<< 34296 1726855346.28962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855346.29000: stderr chunk (state=3): >>><<< 34296 1726855346.29004: stdout chunk (state=3): >>><<< 34296 1726855346.29023: done transferring module to remote 34296 1726855346.29035: _low_level_execute_command(): starting 34296 1726855346.29040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/ /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py && sleep 0' 34296 1726855346.29516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.29522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855346.29524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found <<< 34296 1726855346.29526: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.29528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.29530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found <<< 34296 1726855346.29532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.29578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.29581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.29646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.32009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855346.32013: stdout chunk (state=3): >>><<< 34296 1726855346.32016: stderr chunk (state=3): >>><<< 34296 1726855346.32018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34296 1726855346.32020: _low_level_execute_command(): starting 34296 1726855346.32023: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/AnsiballZ_stat.py && sleep 0' 34296 1726855346.32478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.32482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.32484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration <<< 34296 1726855346.32496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.32537: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.32540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.32616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.35369: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34296 1726855346.35421: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34296 1726855346.35512: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34296 1726855346.35542: stdout chunk (state=3): >>>import 'posix' # <<< 34296 1726855346.35594: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34296 1726855346.35604: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34296 1726855346.35661: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 34296 1726855346.35681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 34296 1726855346.35725: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34296 1726855346.35770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f90184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8fe7b30> <<< 34296 1726855346.35807: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f901aa50> <<< 34296 1726855346.35851: stdout chunk (state=3): >>>import '_signal' # <<< 34296 1726855346.35856: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 34296 1726855346.35903: stdout chunk (state=3): >>>import 'io' # <<< 34296 1726855346.35919: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34296 1726855346.36005: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 34296 1726855346.36058: stdout chunk (state=3): >>>import 'os' # <<< 34296 1726855346.36076: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 34296 1726855346.36112: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34296 1726855346.36150: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34296 1726855346.36153: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e2d130> <<< 34296 1726855346.36230: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.36234: stdout chunk (state=3): >>>import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e2dfa0> <<< 34296 1726855346.36279: stdout chunk (state=3): >>>import 'site' # <<< 34296 1726855346.36284: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34296 1726855346.36514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34296 1726855346.36551: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.36577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34296 1726855346.36618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34296 1726855346.36659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e6be90> <<< 34296 1726855346.36712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34296 1726855346.36751: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e6bf50> <<< 34296 1726855346.36754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py <<< 34296 1726855346.36797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34296 1726855346.36800: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34296 1726855346.36876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.36881: stdout chunk (state=3): >>>import 'itertools' # <<< 34296 1726855346.36918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ea3830> <<< 34296 1726855346.36940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ea3ec0> <<< 34296 1726855346.36951: stdout chunk (state=3): >>>import '_collections' # <<< 34296 1726855346.36993: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e83b60> <<< 34296 1726855346.37025: stdout chunk (state=3): >>>import '_functools' # <<< 34296 1726855346.37029: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e81280> <<< 34296 1726855346.37113: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e69040> <<< 34296 1726855346.37167: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34296 
1726855346.37189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 34296 1726855346.37229: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34296 1726855346.37240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34296 1726855346.37319: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec37d0> <<< 34296 1726855346.37322: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec23f0> <<< 34296 1726855346.37332: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec0c20> <<< 34296 1726855346.37397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34296 1726855346.37443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34296 1726855346.37458: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8ef8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef8bc0> <<< 34296 1726855346.37505: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 34296 1726855346.37560: stdout chunk (state=3): >>> # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8ef8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e66de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.37563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34296 1726855346.37612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef92e0> import 'importlib.machinery' # <<< 34296 1726855346.37670: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efa510> <<< 34296 1726855346.37701: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 34296 1726855346.37714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34296 1726855346.37774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34296 1726855346.37797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f10710> import 'errno' # <<< 34296 1726855346.37843: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f11df0> <<< 34296 1726855346.37873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34296 1726855346.37934: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f12c90> <<< 34296 1726855346.37938: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # 
extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f132f0> <<< 34296 1726855346.37983: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34296 1726855346.38023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34296 1726855346.38027: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f13d70> <<< 34296 1726855346.38048: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f134a0> <<< 34296 1726855346.38091: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efa540> <<< 34296 1726855346.38144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34296 1726855346.38147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34296 1726855346.38163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34296 1726855346.38213: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855346.38250: stdout chunk (state=3): >>># extension module 'math' 
executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8c93bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbc6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbc410> <<< 34296 1726855346.38304: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbc6e0> <<< 34296 1726855346.38323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34296 1726855346.38389: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855346.38512: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbd010> <<< 34296 1726855346.38646: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbda00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbc8c0> <<< 34296 1726855346.38692: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8c91d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34296 1726855346.38724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34296 1726855346.38783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbee10> <<< 34296 1726855346.38788: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbdb50> <<< 34296 1726855346.38816: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34296 1726855346.38885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.38921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34296 1726855346.38955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34296 1726855346.38958: stdout chunk (state=3): 
>>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ce71a0> <<< 34296 1726855346.39037: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.39061: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34296 1726855346.39118: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d0b530> <<< 34296 1726855346.39134: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34296 1726855346.39174: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34296 1726855346.39240: stdout chunk (state=3): >>>import 'ntpath' # <<< 34296 1726855346.39276: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6c290> <<< 34296 1726855346.39295: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34296 1726855346.39322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34296 1726855346.39375: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34296 1726855346.39455: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6e9f0> <<< 34296 1726855346.39528: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6c3b0> <<< 34296 1726855346.39556: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d312e0> <<< 34296 1726855346.39611: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8b753a0> <<< 34296 1726855346.39621: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d0a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbfd70> <<< 34296 1726855346.39723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34296 1726855346.39744: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa3f8b75640> <<< 34296 1726855346.39914: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_8f0q8g1t/ansible_stat_payload.zip' # zipimport: zlib available <<< 34296 1726855346.40073: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.40077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34296 1726855346.40085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34296 1726855346.40127: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34296 1726855346.40198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34296 1726855346.40240: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bcb080> <<< 34296 1726855346.40261: stdout chunk (state=3): >>>import '_typing' # <<< 34296 1726855346.40429: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ba9f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ba9100> <<< 34296 1726855346.40446: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.40490: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 34296 1726855346.40528: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 34296 1726855346.40531: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.42060: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.43070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bc8f20> <<< 34296 1726855346.43104: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34296 1726855346.43130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34296 1726855346.43169: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34296 1726855346.43222: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf2a50> <<< 34296 1726855346.43271: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf27e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34296 1726855346.43346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34296 1726855346.43351: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf2540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bcbaa0> import 'atexit' # <<< 34296 1726855346.43403: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf37d0> <<< 34296 
1726855346.43406: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf39e0> <<< 34296 1726855346.43531: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34296 1726855346.43534: stdout chunk (state=3): >>>import '_locale' # <<< 34296 1726855346.43570: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34296 1726855346.43620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34296 1726855346.43668: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8515d00> <<< 34296 1726855346.43749: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8517920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34296 1726855346.43786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85182f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches 
/usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34296 1726855346.43819: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8519490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34296 1726855346.43901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34296 1726855346.43924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851bf50> <<< 34296 1726855346.43993: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f12c00> <<< 34296 1726855346.44033: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851a210> <<< 34296 1726855346.44036: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34296 1726855346.44039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34296 1726855346.44130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches 
/usr/lib64/python3.12/tokenize.py <<< 34296 1726855346.44180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8523e30> <<< 34296 1726855346.44202: stdout chunk (state=3): >>>import '_tokenize' # <<< 34296 1726855346.44249: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522660> <<< 34296 1726855346.44292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34296 1726855346.44379: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851b440> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f856bef0> <<< 34296 1726855346.44460: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856c140> <<< 34296 1726855346.44504: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34296 1726855346.44723: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f856dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34296 1726855346.44742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85701a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34296 1726855346.44779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.44804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34296 1726855346.44830: stdout chunk (state=3): >>>import '_string' # <<< 34296 1726855346.44854: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8573950> <<< 34296 1726855346.44975: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8570350> <<< 34296 1726855346.45050: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574740> <<< 34296 1726855346.45104: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574b90> <<< 34296 1726855346.45268: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856c290> <<< 34296 1726855346.45272: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # 
code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34296 1726855346.45302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85fc320> <<< 34296 1726855346.45406: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85fd430> <<< 34296 1726855346.45442: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8576ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855346.45512: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8577e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85766f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34296 1726855346.45600: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.45689: stdout chunk (state=3): >>># zipimport: zlib available <<< 
34296 1726855346.45794: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 34296 1726855346.45798: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 34296 1726855346.45816: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.45864: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.46036: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.46534: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.47082: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 34296 1726855346.47194: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34296 1726855346.47223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8405670> <<< 34296 1726855346.47275: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34296 1726855346.47313: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f84063f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85fd6a0> <<< 
34296 1726855346.47374: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 34296 1726855346.47439: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 34296 1726855346.47451: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.47596: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.47759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8406b10> # zipimport: zlib available <<< 34296 1726855346.48186: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48621: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48695: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48777: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34296 1726855346.48786: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48816: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48868: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34296 1726855346.48885: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.48934: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.49024: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 34296 1726855346.49036: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 34296 1726855346.49050: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.49096: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.49377: stdout chunk 
(state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 34296 1726855346.49478: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.49840: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34296 1726855346.49916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34296 1726855346.49939: stdout chunk (state=3): >>>import '_ast' # <<< 34296 1726855346.50029: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f84077a0> <<< 34296 1726855346.50046: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50162: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50249: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34296 1726855346.50297: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 34296 1726855346.50345: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50401: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34296 1726855346.50444: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50470: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50526: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50645: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.50704: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34296 1726855346.50767: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.50877: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34296 1726855346.50883: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8412330> <<< 34296 1726855346.50946: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f840cfb0> <<< 34296 1726855346.50982: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34296 1726855346.50990: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51096: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51177: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51257: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34296 1726855346.51286: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34296 1726855346.51309: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34296 1726855346.51336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34296 1726855346.51414: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34296 1726855346.51450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34296 1726855346.51486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34296 1726855346.51607: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8502b40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8c2e810> <<< 34296 1726855346.51718: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8412060> <<< 34296 1726855346.51729: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e68230> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34296 1726855346.51737: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51776: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51810: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34296 1726855346.51818: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34296 1726855346.51897: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34296 1726855346.51926: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.51930: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 34296 1726855346.51944: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.52102: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.52440: stdout chunk (state=3): >>># zipimport: zlib available <<< 34296 1726855346.52512: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 34296 1726855346.52530: stdout chunk (state=3): >>># destroy __main__ <<< 34296 1726855346.53013: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing 
_collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib <<< 34296 1726855346.53064: stdout chunk (state=3): >>># cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] 
removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp <<< 34296 1726855346.53101: stdout chunk (state=3): >>># cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] 
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34296 1726855346.53351: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34296 1726855346.53373: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 34296 1726855346.53433: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 34296 1726855346.53468: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 34296 1726855346.53532: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 34296 1726855346.53537: stdout chunk (state=3): >>># destroy _locale # destroy pwd <<< 34296 1726855346.53542: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 34296 1726855346.53545: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 34296 1726855346.53547: 
stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 34296 1726855346.53586: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 34296 1726855346.53594: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 34296 1726855346.53660: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime<<< 34296 1726855346.53668: stdout chunk (state=3): >>> # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 34296 1726855346.53947: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34296 1726855346.53951: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy 
_collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket <<< 34296 1726855346.53957: stdout chunk (state=3): >>># destroy _collections <<< 34296 1726855346.53984: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 34296 1726855346.53989: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34296 1726855346.54016: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34296 1726855346.54058: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 34296 1726855346.54088: stdout 
chunk (state=3): >>># destroy ansible.module_utils.six.moves <<< 34296 1726855346.54094: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34296 1726855346.54212: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 34296 1726855346.54215: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 34296 1726855346.54251: stdout chunk (state=3): >>># destroy time # destroy _random <<< 34296 1726855346.54257: stdout chunk (state=3): >>># destroy _weakref <<< 34296 1726855346.54283: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator <<< 34296 1726855346.54294: stdout chunk (state=3): >>># destroy _string # destroy re <<< 34296 1726855346.54313: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre <<< 34296 1726855346.54334: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34296 1726855346.54811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.44 closed. 
<<< 34296 1726855346.54846: stderr chunk (state=3): >>><<< 34296 1726855346.54849: stdout chunk (state=3): >>><<< 34296 1726855346.54914: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f90184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8fe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f901aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e6be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e6bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ea3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ea3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e81280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e69040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ec0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8ef8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8ef8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e66de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ef92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f10710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f11df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f12c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8f134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8c93bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbc6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbc410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbc6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbd010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8cbda00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8c91d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8efac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ce71a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d0b530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6e9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d6c3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d312e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8b753a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8d0a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8cbfd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa3f8b75640> # zipimport: found 30 names in '/tmp/ansible_stat_payload_8f0q8g1t/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa3f8bcb080> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ba9f70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8ba9100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bc8f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf2a50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf27e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf20f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf2540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bcbaa0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf37d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8bf39e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8bf3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8515d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8517920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85182f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8519490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851bf50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8f12c00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851a210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8523e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8522bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f851b440> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f856bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856c140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f856dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85701a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8573950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8570350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574740> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8574b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f856c290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85fc320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f85fd430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8576ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8577e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85766f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8405670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f84063f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f85fd6a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8406b10> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f84077a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa3f8412330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f840cfb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8502b40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8c2e810> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8412060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa3f8e68230> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.10.44 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34296 1726855346.55446: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34296 1726855346.55449: _low_level_execute_command(): starting 34296 1726855346.55451: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726855346.1327639-34395-55604447234336/ > /dev/null 2>&1 && sleep 0' 34296 1726855346.55550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34296 1726855346.55571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34296 1726855346.55575: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34296 1726855346.55622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' <<< 34296 1726855346.55635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34296 1726855346.55638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34296 1726855346.55718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34296 1726855346.58047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34296 1726855346.58071: stderr chunk (state=3): >>><<< 34296 1726855346.58074: stdout chunk (state=3): >>><<< 34296 1726855346.58085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.44 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.44 originally 10.31.10.44 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/74dc155081' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34296 1726855346.58104: handler run complete 34296 1726855346.58118: attempt loop complete, returning result 34296 1726855346.58121: _execute() done 34296 1726855346.58124: dumping result to json 34296 1726855346.58126: done dumping result, returning 34296 1726855346.58136: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affcc66-ac2b-a97a-1acc-00000000015a] 34296 1726855346.58139: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015a 34296 1726855346.58227: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015a 34296 1726855346.58229: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 34296 1726855346.58297: no more pending results, returning what we have 34296 1726855346.58300: results queue empty 34296 1726855346.58301: checking for any_errors_fatal 34296 1726855346.58307: done checking for any_errors_fatal 34296 1726855346.58308: checking for max_fail_percentage 34296 1726855346.58309: done checking for max_fail_percentage 34296 1726855346.58310: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.58311: done checking to see if all hosts have failed 34296 1726855346.58311: getting the remaining hosts for this loop 34296 1726855346.58313: done getting the remaining hosts for this loop 34296 1726855346.58316: getting the next task for host managed_node1 34296 1726855346.58321: done getting next task for host managed_node1 34296 1726855346.58324: ^ task is: TASK: Set flag to indicate system is ostree 34296 1726855346.58326: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855346.58329: getting variables 34296 1726855346.58330: in VariableManager get_vars() 34296 1726855346.58369: Calling all_inventory to load vars for managed_node1 34296 1726855346.58372: Calling groups_inventory to load vars for managed_node1 34296 1726855346.58376: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.58386: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.58390: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.58393: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.58553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.58671: done with get_vars() 34296 1726855346.58679: done getting variables 34296 1726855346.58753: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 14:02:26 -0400 (0:00:00.506) 0:00:02.625 ****** 34296 1726855346.58774: entering _queue_task() for managed_node1/set_fact 34296 1726855346.58775: Creating lock for set_fact 34296 1726855346.58995: worker is 1 (out of 1 available) 34296 
1726855346.59010: exiting _queue_task() for managed_node1/set_fact 34296 1726855346.59021: done queuing things up, now waiting for results queue to drain 34296 1726855346.59023: waiting for pending results... 34296 1726855346.59165: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 34296 1726855346.59227: in run() - task 0affcc66-ac2b-a97a-1acc-00000000015b 34296 1726855346.59236: variable 'ansible_search_path' from source: unknown 34296 1726855346.59240: variable 'ansible_search_path' from source: unknown 34296 1726855346.59275: calling self._execute() 34296 1726855346.59329: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.59333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.59342: variable 'omit' from source: magic vars 34296 1726855346.59745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34296 1726855346.59918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34296 1726855346.59948: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34296 1726855346.59974: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34296 1726855346.59999: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34296 1726855346.60065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34296 1726855346.60084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34296 1726855346.60104: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34296 1726855346.60129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34296 1726855346.60212: Evaluated conditional (not __network_is_ostree is defined): True 34296 1726855346.60215: variable 'omit' from source: magic vars 34296 1726855346.60245: variable 'omit' from source: magic vars 34296 1726855346.60325: variable '__ostree_booted_stat' from source: set_fact 34296 1726855346.60365: variable 'omit' from source: magic vars 34296 1726855346.60385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34296 1726855346.60407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34296 1726855346.60423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34296 1726855346.60435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855346.60446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855346.60473: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34296 1726855346.60476: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.60478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.60539: Set connection var ansible_shell_type to sh 34296 1726855346.60546: Set connection var ansible_shell_executable to /bin/sh 34296 1726855346.60549: Set connection var ansible_connection to ssh 34296 1726855346.60558: Set connection var ansible_timeout to 
10 34296 1726855346.60561: Set connection var ansible_module_compression to ZIP_DEFLATED 34296 1726855346.60569: Set connection var ansible_pipelining to False 34296 1726855346.60589: variable 'ansible_shell_executable' from source: unknown 34296 1726855346.60592: variable 'ansible_connection' from source: unknown 34296 1726855346.60594: variable 'ansible_module_compression' from source: unknown 34296 1726855346.60596: variable 'ansible_shell_type' from source: unknown 34296 1726855346.60599: variable 'ansible_shell_executable' from source: unknown 34296 1726855346.60601: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.60605: variable 'ansible_pipelining' from source: unknown 34296 1726855346.60607: variable 'ansible_timeout' from source: unknown 34296 1726855346.60611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.60683: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34296 1726855346.60693: variable 'omit' from source: magic vars 34296 1726855346.60698: starting attempt loop 34296 1726855346.60701: running the handler 34296 1726855346.60711: handler run complete 34296 1726855346.60718: attempt loop complete, returning result 34296 1726855346.60720: _execute() done 34296 1726855346.60723: dumping result to json 34296 1726855346.60725: done dumping result, returning 34296 1726855346.60732: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affcc66-ac2b-a97a-1acc-00000000015b] 34296 1726855346.60735: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015b 34296 1726855346.60810: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015b 34296 1726855346.60813: WORKER 
PROCESS EXITING ok: [managed_node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
34296 1726855346.60859: no more pending results, returning what we have
34296 1726855346.60862: results queue empty
34296 1726855346.60864: checking for any_errors_fatal
34296 1726855346.60870: done checking for any_errors_fatal
34296 1726855346.60871: checking for max_fail_percentage
34296 1726855346.60872: done checking for max_fail_percentage
34296 1726855346.60873: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.60874: done checking to see if all hosts have failed
34296 1726855346.60874: getting the remaining hosts for this loop
34296 1726855346.60876: done getting the remaining hosts for this loop
34296 1726855346.60879: getting the next task for host managed_node1
34296 1726855346.60888: done getting next task for host managed_node1
34296 1726855346.60891: ^ task is: TASK: Fix CentOS6 Base repo
34296 1726855346.60893: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.60897: getting variables
34296 1726855346.60898: in VariableManager get_vars()
34296 1726855346.60925: Calling all_inventory to load vars for managed_node1
34296 1726855346.60927: Calling groups_inventory to load vars for managed_node1
34296 1726855346.60930: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.60939: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.60941: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.60949: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.61115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.61230: done with get_vars()
34296 1726855346.61237: done getting variables
34296 1726855346.61323: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024 14:02:26 -0400 (0:00:00.025) 0:00:02.651 ******
34296 1726855346.61342: entering _queue_task() for managed_node1/copy
34296 1726855346.61540: worker is 1 (out of 1 available)
34296 1726855346.61553: exiting _queue_task() for managed_node1/copy
34296 1726855346.61565: done queuing things up, now waiting for results queue to drain
34296 1726855346.61569: waiting for pending results...
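The ostree check traced above is the standard two-step pattern in the Linux System Roles test setup: a `stat` of `/run/ostree-booted` registered into `__ostree_booted_stat`, then a `set_fact` guarded by `not __network_is_ostree is defined`. A minimal sketch of what those tasks in `el_repo_setup.yml` likely look like; the registered variable, fact name, path, and conditional all appear in the trace, but the exact YAML is a reconstruction, not a quote from the file:

```yaml
# Hedged reconstruction from the trace -- not copied from el_repo_setup.yml.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # the path passed to _execute_module(stat, ...)
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # evaluated True in the trace
```

With `stat.exists` false on managed_node1, this yields the `__network_is_ostree: false` fact shown in the task result.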
34296 1726855346.61709: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 34296 1726855346.61764: in run() - task 0affcc66-ac2b-a97a-1acc-00000000015d 34296 1726855346.61774: variable 'ansible_search_path' from source: unknown 34296 1726855346.61777: variable 'ansible_search_path' from source: unknown 34296 1726855346.61809: calling self._execute() 34296 1726855346.61861: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.61865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.61875: variable 'omit' from source: magic vars 34296 1726855346.62207: variable 'ansible_distribution' from source: facts 34296 1726855346.62225: Evaluated conditional (ansible_distribution == 'CentOS'): True 34296 1726855346.62307: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.62311: Evaluated conditional (ansible_distribution_major_version == '6'): False 34296 1726855346.62314: when evaluation is False, skipping this task 34296 1726855346.62317: _execute() done 34296 1726855346.62319: dumping result to json 34296 1726855346.62324: done dumping result, returning 34296 1726855346.62330: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affcc66-ac2b-a97a-1acc-00000000015d] 34296 1726855346.62337: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015d 34296 1726855346.62423: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015d 34296 1726855346.62426: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34296 1726855346.62507: no more pending results, returning what we have 34296 1726855346.62510: results queue empty 34296 1726855346.62510: checking for any_errors_fatal 34296 1726855346.62513: done checking for any_errors_fatal 34296 1726855346.62514: checking for 
max_fail_percentage 34296 1726855346.62515: done checking for max_fail_percentage 34296 1726855346.62516: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.62517: done checking to see if all hosts have failed 34296 1726855346.62517: getting the remaining hosts for this loop 34296 1726855346.62519: done getting the remaining hosts for this loop 34296 1726855346.62521: getting the next task for host managed_node1 34296 1726855346.62526: done getting next task for host managed_node1 34296 1726855346.62528: ^ task is: TASK: Include the task 'enable_epel.yml' 34296 1726855346.62530: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.62533: getting variables 34296 1726855346.62534: in VariableManager get_vars() 34296 1726855346.62555: Calling all_inventory to load vars for managed_node1 34296 1726855346.62557: Calling groups_inventory to load vars for managed_node1 34296 1726855346.62560: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.62570: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.62573: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.62576: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.62682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.62799: done with get_vars() 34296 1726855346.62805: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 14:02:26 -0400 (0:00:00.015) 0:00:02.666 ****** 34296 1726855346.62864: entering _queue_task() for managed_node1/include_tasks 34296 1726855346.63055: worker is 1 (out of 1 available) 34296 1726855346.63071: exiting _queue_task() for managed_node1/include_tasks 34296 1726855346.63082: done queuing things up, now waiting for results queue to drain 34296 1726855346.63083: waiting for pending results... 
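The include task queued here is gated on an ostree check: the trace shows `Evaluated conditional (not __network_is_ostree | d(false)): True` before the include proceeds. A minimal sketch of el_repo_setup.yml:51, assuming the relative include path (the conditional itself is taken verbatim from the log):

```yaml
# Hypothetical reconstruction; the when-expression is confirmed by the
# trace, the include path is an assumption.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)  # evaluated True in the log
```

`d(false)` is the short alias for the `default` filter, so the include still runs when `__network_is_ostree` was never set; here it comes from an earlier `set_fact`, as the trace notes (`variable '__network_is_ostree' from source: set_fact`).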
34296 1726855346.63223: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 34296 1726855346.63277: in run() - task 0affcc66-ac2b-a97a-1acc-00000000015e 34296 1726855346.63286: variable 'ansible_search_path' from source: unknown 34296 1726855346.63292: variable 'ansible_search_path' from source: unknown 34296 1726855346.63321: calling self._execute() 34296 1726855346.63428: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.63433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.63443: variable 'omit' from source: magic vars 34296 1726855346.63781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34296 1726855346.65372: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34296 1726855346.65428: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34296 1726855346.65468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34296 1726855346.65500: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34296 1726855346.65516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34296 1726855346.65575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34296 1726855346.65599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34296 1726855346.65620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34296 1726855346.65645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34296 1726855346.65656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34296 1726855346.65743: variable '__network_is_ostree' from source: set_fact 34296 1726855346.65757: Evaluated conditional (not __network_is_ostree | d(false)): True 34296 1726855346.65763: _execute() done 34296 1726855346.65768: dumping result to json 34296 1726855346.65771: done dumping result, returning 34296 1726855346.65775: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affcc66-ac2b-a97a-1acc-00000000015e] 34296 1726855346.65779: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015e 34296 1726855346.65864: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000015e 34296 1726855346.65870: WORKER PROCESS EXITING 34296 1726855346.65901: no more pending results, returning what we have 34296 1726855346.65905: in VariableManager get_vars() 34296 1726855346.65937: Calling all_inventory to load vars for managed_node1 34296 1726855346.65940: Calling groups_inventory to load vars for managed_node1 34296 1726855346.65999: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.66009: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.66011: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.66014: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.66145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 34296 1726855346.66260: done with get_vars() 34296 1726855346.66266: variable 'ansible_search_path' from source: unknown 34296 1726855346.66267: variable 'ansible_search_path' from source: unknown 34296 1726855346.66311: we have included files to process 34296 1726855346.66313: generating all_blocks data 34296 1726855346.66314: done generating all_blocks data 34296 1726855346.66317: processing included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34296 1726855346.66318: loading included file: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34296 1726855346.66320: Loading data from /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34296 1726855346.66914: done processing included file 34296 1726855346.66915: iterating over new_blocks loaded from include file 34296 1726855346.66916: in VariableManager get_vars() 34296 1726855346.66924: done with get_vars() 34296 1726855346.66925: filtering new block on tags 34296 1726855346.66938: done filtering new block on tags 34296 1726855346.66940: in VariableManager get_vars() 34296 1726855346.66946: done with get_vars() 34296 1726855346.66947: filtering new block on tags 34296 1726855346.66953: done filtering new block on tags 34296 1726855346.66954: done iterating over new_blocks loaded from include file included: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 34296 1726855346.66958: extending task lists for all hosts with included blocks 34296 1726855346.67020: done extending task lists 34296 1726855346.67021: done processing included files 34296 1726855346.67022: results queue empty 34296 1726855346.67022: checking for any_errors_fatal 34296 1726855346.67024: done checking for any_errors_fatal 34296 1726855346.67025: checking for max_fail_percentage 34296 1726855346.67025: done 
checking for max_fail_percentage 34296 1726855346.67026: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.67026: done checking to see if all hosts have failed 34296 1726855346.67027: getting the remaining hosts for this loop 34296 1726855346.67027: done getting the remaining hosts for this loop 34296 1726855346.67029: getting the next task for host managed_node1 34296 1726855346.67032: done getting next task for host managed_node1 34296 1726855346.67033: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34296 1726855346.67035: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.67036: getting variables 34296 1726855346.67037: in VariableManager get_vars() 34296 1726855346.67042: Calling all_inventory to load vars for managed_node1 34296 1726855346.67043: Calling groups_inventory to load vars for managed_node1 34296 1726855346.67045: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.67049: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.67055: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.67057: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.67138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.67250: done with get_vars() 34296 1726855346.67256: done getting variables 34296 1726855346.67306: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34296 1726855346.67437: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 14:02:26 -0400 (0:00:00.046) 0:00:02.712 ****** 34296 1726855346.67470: entering _queue_task() for managed_node1/command 34296 1726855346.67471: Creating lock for command 34296 1726855346.67707: worker is 1 (out of 1 available) 34296 1726855346.67721: exiting _queue_task() for managed_node1/command 34296 1726855346.67733: done queuing things up, now waiting for results queue to drain 34296 1726855346.67734: waiting for pending results... 
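The "Create EPEL {{ ansible_distribution_major_version }}" task renders its name from facts (hence the header "Create EPEL 10" on this host) and is gated on distribution and major version. A hedged sketch of enable_epel.yml:8 — the `command` action plugin and both conditionals appear in the trace, but the command line itself is an assumption:

```yaml
# Hypothetical reconstruction; the command shown is an assumption -- only
# the command module and the two when-conditions appear in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    dnf install -y
    https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.rpm
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # evaluated True in the log
    - ansible_distribution_major_version in ['7', '8']   # evaluated False -> skipped
```

Note that the task name is templated at display time, which is why the log line `variable 'ansible_distribution_major_version' from source: facts` appears just before the TASK banner is printed.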
34296 1726855346.67894: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 34296 1726855346.67966: in run() - task 0affcc66-ac2b-a97a-1acc-000000000178 34296 1726855346.67975: variable 'ansible_search_path' from source: unknown 34296 1726855346.67979: variable 'ansible_search_path' from source: unknown 34296 1726855346.68007: calling self._execute() 34296 1726855346.68063: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.68072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.68084: variable 'omit' from source: magic vars 34296 1726855346.68399: variable 'ansible_distribution' from source: facts 34296 1726855346.68411: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34296 1726855346.68493: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.68502: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34296 1726855346.68506: when evaluation is False, skipping this task 34296 1726855346.68510: _execute() done 34296 1726855346.68513: dumping result to json 34296 1726855346.68516: done dumping result, returning 34296 1726855346.68518: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0affcc66-ac2b-a97a-1acc-000000000178] 34296 1726855346.68522: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000178 34296 1726855346.68616: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000178 34296 1726855346.68618: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34296 1726855346.68684: no more pending results, returning what we have 34296 1726855346.68689: results queue empty 34296 1726855346.68690: checking for any_errors_fatal 34296 1726855346.68692: done checking for any_errors_fatal 34296 1726855346.68692: checking for 
max_fail_percentage 34296 1726855346.68694: done checking for max_fail_percentage 34296 1726855346.68694: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.68695: done checking to see if all hosts have failed 34296 1726855346.68696: getting the remaining hosts for this loop 34296 1726855346.68697: done getting the remaining hosts for this loop 34296 1726855346.68700: getting the next task for host managed_node1 34296 1726855346.68706: done getting next task for host managed_node1 34296 1726855346.68708: ^ task is: TASK: Install yum-utils package 34296 1726855346.68711: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.68715: getting variables 34296 1726855346.68716: in VariableManager get_vars() 34296 1726855346.68778: Calling all_inventory to load vars for managed_node1 34296 1726855346.68781: Calling groups_inventory to load vars for managed_node1 34296 1726855346.68783: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.68801: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.68803: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.68806: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.68920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.69041: done with get_vars() 34296 1726855346.69048: done getting variables 34296 1726855346.69123: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 14:02:26 -0400 (0:00:00.016) 0:00:02.729 ****** 34296 1726855346.69143: entering _queue_task() for managed_node1/package 34296 1726855346.69144: Creating lock for package 34296 1726855346.69349: worker is 1 (out of 1 available) 34296 1726855346.69362: exiting _queue_task() for managed_node1/package 34296 1726855346.69376: done queuing things up, now waiting for results queue to drain 34296 1726855346.69377: waiting for pending results... 
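The "Install yum-utils package" task at enable_epel.yml:26 uses the generic `package` action (the trace shows the `package` ActionModule being loaded and a lock created for it) with the same distribution/version gate. A minimal sketch, assuming `state: present`:

```yaml
# Hypothetical reconstruction; module (package) and both when-conditions
# come from the trace, state is an assumption.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']       # True in the log
    - ansible_distribution_major_version in ['7', '8']   # False -> skipped
```

Using `package` rather than `yum`/`dnf` directly lets the same task work across package managers; Ansible resolves the concrete backend from the discovered facts.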
34296 1726855346.69518: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 34296 1726855346.69591: in run() - task 0affcc66-ac2b-a97a-1acc-000000000179 34296 1726855346.69608: variable 'ansible_search_path' from source: unknown 34296 1726855346.69611: variable 'ansible_search_path' from source: unknown 34296 1726855346.69634: calling self._execute() 34296 1726855346.69689: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.69693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.69701: variable 'omit' from source: magic vars 34296 1726855346.70041: variable 'ansible_distribution' from source: facts 34296 1726855346.70047: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34296 1726855346.70071: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.70075: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34296 1726855346.70078: when evaluation is False, skipping this task 34296 1726855346.70081: _execute() done 34296 1726855346.70083: dumping result to json 34296 1726855346.70086: done dumping result, returning 34296 1726855346.70095: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affcc66-ac2b-a97a-1acc-000000000179] 34296 1726855346.70098: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000179 34296 1726855346.70183: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000179 34296 1726855346.70186: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34296 1726855346.70232: no more pending results, returning what we have 34296 1726855346.70235: results queue empty 34296 1726855346.70236: checking for any_errors_fatal 34296 1726855346.70244: done checking for any_errors_fatal 34296 
1726855346.70245: checking for max_fail_percentage 34296 1726855346.70246: done checking for max_fail_percentage 34296 1726855346.70247: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.70248: done checking to see if all hosts have failed 34296 1726855346.70249: getting the remaining hosts for this loop 34296 1726855346.70250: done getting the remaining hosts for this loop 34296 1726855346.70253: getting the next task for host managed_node1 34296 1726855346.70259: done getting next task for host managed_node1 34296 1726855346.70261: ^ task is: TASK: Enable EPEL 7 34296 1726855346.70264: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.70269: getting variables 34296 1726855346.70271: in VariableManager get_vars() 34296 1726855346.70294: Calling all_inventory to load vars for managed_node1 34296 1726855346.70298: Calling groups_inventory to load vars for managed_node1 34296 1726855346.70301: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.70309: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.70311: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.70313: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.70435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.70569: done with get_vars() 34296 1726855346.70578: done getting variables 34296 1726855346.70619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 14:02:26 -0400 (0:00:00.014) 0:00:02.744 ****** 34296 1726855346.70639: entering _queue_task() for managed_node1/command 34296 1726855346.70823: worker is 1 (out of 1 available) 34296 1726855346.70838: exiting _queue_task() for managed_node1/command 34296 1726855346.70849: done queuing things up, now waiting for results queue to drain 34296 1726855346.70850: waiting for pending results... 
34296 1726855346.70996: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 34296 1726855346.71061: in run() - task 0affcc66-ac2b-a97a-1acc-00000000017a 34296 1726855346.71073: variable 'ansible_search_path' from source: unknown 34296 1726855346.71078: variable 'ansible_search_path' from source: unknown 34296 1726855346.71106: calling self._execute() 34296 1726855346.71158: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.71162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.71173: variable 'omit' from source: magic vars 34296 1726855346.71445: variable 'ansible_distribution' from source: facts 34296 1726855346.71456: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34296 1726855346.71545: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.71549: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34296 1726855346.71552: when evaluation is False, skipping this task 34296 1726855346.71555: _execute() done 34296 1726855346.71558: dumping result to json 34296 1726855346.71560: done dumping result, returning 34296 1726855346.71568: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affcc66-ac2b-a97a-1acc-00000000017a] 34296 1726855346.71575: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017a 34296 1726855346.71655: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017a 34296 1726855346.71657: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34296 1726855346.71701: no more pending results, returning what we have 34296 1726855346.71705: results queue empty 34296 1726855346.71706: checking for any_errors_fatal 34296 1726855346.71712: done checking for any_errors_fatal 34296 1726855346.71713: checking for 
max_fail_percentage 34296 1726855346.71714: done checking for max_fail_percentage 34296 1726855346.71715: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.71715: done checking to see if all hosts have failed 34296 1726855346.71716: getting the remaining hosts for this loop 34296 1726855346.71718: done getting the remaining hosts for this loop 34296 1726855346.71721: getting the next task for host managed_node1 34296 1726855346.71726: done getting next task for host managed_node1 34296 1726855346.71728: ^ task is: TASK: Enable EPEL 8 34296 1726855346.71732: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.71735: getting variables 34296 1726855346.71736: in VariableManager get_vars() 34296 1726855346.71758: Calling all_inventory to load vars for managed_node1 34296 1726855346.71760: Calling groups_inventory to load vars for managed_node1 34296 1726855346.71763: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.71772: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.71774: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.71776: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.71902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.72019: done with get_vars() 34296 1726855346.72026: done getting variables 34296 1726855346.72067: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 14:02:26 -0400 (0:00:00.014) 0:00:02.758 ****** 34296 1726855346.72090: entering _queue_task() for managed_node1/command 34296 1726855346.72279: worker is 1 (out of 1 available) 34296 1726855346.72293: exiting _queue_task() for managed_node1/command 34296 1726855346.72306: done queuing things up, now waiting for results queue to drain 34296 1726855346.72307: waiting for pending results... 
34296 1726855346.72452: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 34296 1726855346.72518: in run() - task 0affcc66-ac2b-a97a-1acc-00000000017b 34296 1726855346.72531: variable 'ansible_search_path' from source: unknown 34296 1726855346.72534: variable 'ansible_search_path' from source: unknown 34296 1726855346.72559: calling self._execute() 34296 1726855346.72618: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.72622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.72631: variable 'omit' from source: magic vars 34296 1726855346.72906: variable 'ansible_distribution' from source: facts 34296 1726855346.72916: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34296 1726855346.73003: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.73008: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34296 1726855346.73011: when evaluation is False, skipping this task 34296 1726855346.73014: _execute() done 34296 1726855346.73016: dumping result to json 34296 1726855346.73018: done dumping result, returning 34296 1726855346.73027: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affcc66-ac2b-a97a-1acc-00000000017b] 34296 1726855346.73030: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017b 34296 1726855346.73121: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017b 34296 1726855346.73124: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34296 1726855346.73169: no more pending results, returning what we have 34296 1726855346.73172: results queue empty 34296 1726855346.73173: checking for any_errors_fatal 34296 1726855346.73178: done checking for any_errors_fatal 34296 1726855346.73179: checking for 
max_fail_percentage 34296 1726855346.73181: done checking for max_fail_percentage 34296 1726855346.73181: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.73182: done checking to see if all hosts have failed 34296 1726855346.73183: getting the remaining hosts for this loop 34296 1726855346.73184: done getting the remaining hosts for this loop 34296 1726855346.73189: getting the next task for host managed_node1 34296 1726855346.73197: done getting next task for host managed_node1 34296 1726855346.73200: ^ task is: TASK: Enable EPEL 6 34296 1726855346.73203: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.73207: getting variables 34296 1726855346.73208: in VariableManager get_vars() 34296 1726855346.73231: Calling all_inventory to load vars for managed_node1 34296 1726855346.73233: Calling groups_inventory to load vars for managed_node1 34296 1726855346.73235: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.73244: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.73246: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.73248: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.73417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.73532: done with get_vars() 34296 1726855346.73539: done getting variables 34296 1726855346.73580: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 14:02:26 -0400 (0:00:00.015) 0:00:02.773 ****** 34296 1726855346.73602: entering _queue_task() for managed_node1/copy 34296 1726855346.73794: worker is 1 (out of 1 available) 34296 1726855346.73808: exiting _queue_task() for managed_node1/copy 34296 1726855346.73817: done queuing things up, now waiting for results queue to drain 34296 1726855346.73818: waiting for pending results... 
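The final branch, "Enable EPEL 6" at enable_epel.yml:42, switches back to the `copy` action and gates solely on major version 6 (`Evaluated conditional (ansible_distribution_major_version == '6'): False` in the trace). A hedged sketch — the repo file path and contents are assumptions; only the module and the condition are confirmed:

```yaml
# Hypothetical reconstruction; module (copy) and the when-condition are
# from the trace, destination and contents are assumptions.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo
    content: |
      [epel]
      name=Extra Packages for Enterprise Linux 6
      baseurl=https://archives.fedoraproject.org/pub/archive/epel/6/$basearch/
      gpgcheck=0
  when: ansible_distribution_major_version == '6'  # False on this host -> skipped
```

With every EPEL branch skipped, the play moves on to the block's next task, "Set network provider to 'nm'", as the host state dump that follows shows.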
34296 1726855346.73968: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 34296 1726855346.74042: in run() - task 0affcc66-ac2b-a97a-1acc-00000000017d 34296 1726855346.74062: variable 'ansible_search_path' from source: unknown 34296 1726855346.74065: variable 'ansible_search_path' from source: unknown 34296 1726855346.74090: calling self._execute() 34296 1726855346.74140: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.74144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.74155: variable 'omit' from source: magic vars 34296 1726855346.74429: variable 'ansible_distribution' from source: facts 34296 1726855346.74439: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34296 1726855346.74592: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.74596: Evaluated conditional (ansible_distribution_major_version == '6'): False 34296 1726855346.74598: when evaluation is False, skipping this task 34296 1726855346.74601: _execute() done 34296 1726855346.74603: dumping result to json 34296 1726855346.74609: done dumping result, returning 34296 1726855346.74612: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affcc66-ac2b-a97a-1acc-00000000017d] 34296 1726855346.74614: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34296 1726855346.74780: no more pending results, returning what we have 34296 1726855346.74784: results queue empty 34296 1726855346.74785: checking for any_errors_fatal 34296 1726855346.74793: done checking for any_errors_fatal 34296 1726855346.74795: checking for max_fail_percentage 34296 1726855346.74796: done checking for max_fail_percentage 34296 1726855346.74797: checking to see if all hosts have failed and the running 
result is not ok 34296 1726855346.74798: done checking to see if all hosts have failed 34296 1726855346.74799: getting the remaining hosts for this loop 34296 1726855346.74800: done getting the remaining hosts for this loop 34296 1726855346.74804: getting the next task for host managed_node1 34296 1726855346.74813: done getting next task for host managed_node1 34296 1726855346.74816: ^ task is: TASK: Set network provider to 'nm' 34296 1726855346.74821: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855346.74825: getting variables 34296 1726855346.74826: in VariableManager get_vars() 34296 1726855346.74854: Calling all_inventory to load vars for managed_node1 34296 1726855346.74857: Calling groups_inventory to load vars for managed_node1 34296 1726855346.74861: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.74891: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000017d 34296 1726855346.74894: WORKER PROCESS EXITING 34296 1726855346.74904: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.74906: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.74909: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.75076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.75278: done with get_vars() 34296 1726855346.75286: done getting variables 34296 1726855346.75351: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 14:02:26 -0400 (0:00:00.017) 0:00:02.791 ****** 34296 1726855346.75377: entering _queue_task() for managed_node1/set_fact 34296 1726855346.75644: worker is 1 (out of 1 available) 34296 1726855346.75659: exiting _queue_task() for managed_node1/set_fact 34296 1726855346.75670: done queuing things up, now waiting for results queue to drain 34296 1726855346.75671: waiting for pending results... 34296 1726855346.75817: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 34296 1726855346.75885: in run() - task 0affcc66-ac2b-a97a-1acc-000000000007 34296 1726855346.75898: variable 'ansible_search_path' from source: unknown 34296 1726855346.75926: calling self._execute() 34296 1726855346.75984: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.75989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.75999: variable 'omit' from source: magic vars 34296 1726855346.76074: variable 'omit' from source: magic vars 34296 1726855346.76098: variable 'omit' from source: magic vars 34296 1726855346.76125: variable 'omit' from source: magic vars 34296 1726855346.76158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34296 1726855346.76192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34296 1726855346.76209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34296 1726855346.76222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855346.76232: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34296 1726855346.76256: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34296 1726855346.76259: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.76262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.76334: Set connection var ansible_shell_type to sh 34296 1726855346.76340: Set connection var ansible_shell_executable to /bin/sh 34296 1726855346.76343: Set connection var ansible_connection to ssh 34296 1726855346.76350: Set connection var ansible_timeout to 10 34296 1726855346.76355: Set connection var ansible_module_compression to ZIP_DEFLATED 34296 1726855346.76360: Set connection var ansible_pipelining to False 34296 1726855346.76383: variable 'ansible_shell_executable' from source: unknown 34296 1726855346.76386: variable 'ansible_connection' from source: unknown 34296 1726855346.76390: variable 'ansible_module_compression' from source: unknown 34296 1726855346.76392: variable 'ansible_shell_type' from source: unknown 34296 1726855346.76396: variable 'ansible_shell_executable' from source: unknown 34296 1726855346.76399: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.76401: variable 'ansible_pipelining' from source: unknown 34296 1726855346.76403: variable 'ansible_timeout' from source: unknown 34296 1726855346.76413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.76513: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34296 1726855346.76526: variable 'omit' from source: magic vars 34296 1726855346.76529: starting 
attempt loop 34296 1726855346.76532: running the handler 34296 1726855346.76542: handler run complete 34296 1726855346.76549: attempt loop complete, returning result 34296 1726855346.76551: _execute() done 34296 1726855346.76554: dumping result to json 34296 1726855346.76556: done dumping result, returning 34296 1726855346.76563: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affcc66-ac2b-a97a-1acc-000000000007] 34296 1726855346.76569: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000007 34296 1726855346.76645: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000007 34296 1726855346.76647: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34296 1726855346.76708: no more pending results, returning what we have 34296 1726855346.76711: results queue empty 34296 1726855346.76712: checking for any_errors_fatal 34296 1726855346.76720: done checking for any_errors_fatal 34296 1726855346.76720: checking for max_fail_percentage 34296 1726855346.76722: done checking for max_fail_percentage 34296 1726855346.76722: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.76723: done checking to see if all hosts have failed 34296 1726855346.76724: getting the remaining hosts for this loop 34296 1726855346.76725: done getting the remaining hosts for this loop 34296 1726855346.76728: getting the next task for host managed_node1 34296 1726855346.76733: done getting next task for host managed_node1 34296 1726855346.76735: ^ task is: TASK: meta (flush_handlers) 34296 1726855346.76737: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.76741: getting variables 34296 1726855346.76742: in VariableManager get_vars() 34296 1726855346.76767: Calling all_inventory to load vars for managed_node1 34296 1726855346.76770: Calling groups_inventory to load vars for managed_node1 34296 1726855346.76772: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.76781: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.76782: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.76785: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.76938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.77051: done with get_vars() 34296 1726855346.77061: done getting variables 34296 1726855346.77109: in VariableManager get_vars() 34296 1726855346.77116: Calling all_inventory to load vars for managed_node1 34296 1726855346.77117: Calling groups_inventory to load vars for managed_node1 34296 1726855346.77118: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.77121: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.77122: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.77124: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.77206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.77312: done with get_vars() 34296 1726855346.77321: done queuing things up, now waiting for results queue to drain 34296 1726855346.77322: results queue empty 34296 1726855346.77323: checking for any_errors_fatal 34296 1726855346.77324: done checking for any_errors_fatal 34296 1726855346.77325: checking for max_fail_percentage 34296 1726855346.77325: done checking for max_fail_percentage 34296 1726855346.77326: checking to see if all hosts have failed and the running result is not 
ok 34296 1726855346.77326: done checking to see if all hosts have failed 34296 1726855346.77327: getting the remaining hosts for this loop 34296 1726855346.77327: done getting the remaining hosts for this loop 34296 1726855346.77329: getting the next task for host managed_node1 34296 1726855346.77331: done getting next task for host managed_node1 34296 1726855346.77332: ^ task is: TASK: meta (flush_handlers) 34296 1726855346.77333: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855346.77339: getting variables 34296 1726855346.77339: in VariableManager get_vars() 34296 1726855346.77344: Calling all_inventory to load vars for managed_node1 34296 1726855346.77345: Calling groups_inventory to load vars for managed_node1 34296 1726855346.77347: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.77350: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.77351: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.77353: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.77553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.77745: done with get_vars() 34296 1726855346.77752: done getting variables 34296 1726855346.77799: in VariableManager get_vars() 34296 1726855346.77808: Calling all_inventory to load vars for managed_node1 34296 1726855346.77810: Calling groups_inventory to load vars for managed_node1 34296 1726855346.77812: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.77816: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.77818: Calling groups_plugins_inventory to load vars for 
managed_node1 34296 1726855346.77820: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.77931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.78302: done with get_vars() 34296 1726855346.78313: done queuing things up, now waiting for results queue to drain 34296 1726855346.78315: results queue empty 34296 1726855346.78316: checking for any_errors_fatal 34296 1726855346.78317: done checking for any_errors_fatal 34296 1726855346.78318: checking for max_fail_percentage 34296 1726855346.78319: done checking for max_fail_percentage 34296 1726855346.78319: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.78320: done checking to see if all hosts have failed 34296 1726855346.78321: getting the remaining hosts for this loop 34296 1726855346.78321: done getting the remaining hosts for this loop 34296 1726855346.78324: getting the next task for host managed_node1 34296 1726855346.78326: done getting next task for host managed_node1 34296 1726855346.78327: ^ task is: None 34296 1726855346.78328: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.78329: done queuing things up, now waiting for results queue to drain 34296 1726855346.78330: results queue empty 34296 1726855346.78331: checking for any_errors_fatal 34296 1726855346.78332: done checking for any_errors_fatal 34296 1726855346.78333: checking for max_fail_percentage 34296 1726855346.78333: done checking for max_fail_percentage 34296 1726855346.78334: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.78335: done checking to see if all hosts have failed 34296 1726855346.78336: getting the next task for host managed_node1 34296 1726855346.78338: done getting next task for host managed_node1 34296 1726855346.78339: ^ task is: None 34296 1726855346.78340: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.78384: in VariableManager get_vars() 34296 1726855346.78415: done with get_vars() 34296 1726855346.78421: in VariableManager get_vars() 34296 1726855346.78442: done with get_vars() 34296 1726855346.78448: variable 'omit' from source: magic vars 34296 1726855346.78486: in VariableManager get_vars() 34296 1726855346.78510: done with get_vars() 34296 1726855346.78531: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 34296 1726855346.79598: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34296 1726855346.79627: getting the remaining hosts for this loop 34296 1726855346.79628: done getting the remaining hosts for this loop 34296 1726855346.79631: getting the next task for host managed_node1 34296 1726855346.79634: done getting next task for host managed_node1 34296 1726855346.79636: ^ task is: TASK: Gathering Facts 34296 1726855346.79637: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.79639: getting variables 34296 1726855346.79640: in VariableManager get_vars() 34296 1726855346.79657: Calling all_inventory to load vars for managed_node1 34296 1726855346.79659: Calling groups_inventory to load vars for managed_node1 34296 1726855346.79661: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.79669: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.79682: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.79686: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.80029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.80441: done with get_vars() 34296 1726855346.80450: done getting variables 34296 1726855346.80696: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 14:02:26 -0400 (0:00:00.053) 0:00:02.844 ****** 34296 1726855346.80721: entering _queue_task() for managed_node1/gather_facts 34296 1726855346.81050: worker is 1 (out of 1 available) 34296 1726855346.81072: exiting _queue_task() for managed_node1/gather_facts 34296 1726855346.81084: done queuing things up, now waiting for results queue to drain 34296 1726855346.81086: waiting for pending results... 
34296 1726855346.81339: running TaskExecutor() for managed_node1/TASK: Gathering Facts 34296 1726855346.81438: in run() - task 0affcc66-ac2b-a97a-1acc-0000000001a3 34296 1726855346.81463: variable 'ansible_search_path' from source: unknown 34296 1726855346.81508: calling self._execute() 34296 1726855346.81602: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.81612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.81624: variable 'omit' from source: magic vars 34296 1726855346.81973: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.81986: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855346.82071: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.82075: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855346.82078: when evaluation is False, skipping this task 34296 1726855346.82081: _execute() done 34296 1726855346.82084: dumping result to json 34296 1726855346.82089: done dumping result, returning 34296 1726855346.82096: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcc66-ac2b-a97a-1acc-0000000001a3] 34296 1726855346.82099: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000001a3 34296 1726855346.82167: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000001a3 34296 1726855346.82180: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855346.82258: no more pending results, returning what we have 34296 1726855346.82262: results queue empty 34296 1726855346.82262: checking for any_errors_fatal 34296 1726855346.82264: done checking for any_errors_fatal 34296 1726855346.82264: checking for max_fail_percentage 34296 1726855346.82265: done checking for max_fail_percentage 
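The skip above follows the two-step gate used throughout this run: `ansible_distribution_major_version != '6'` evaluates True, then `== '7'` evaluates False, so the task never executes. A hedged sketch of a fact-gathering task gated this way — the two conditions are taken from the trace; the surrounding structure is assumed, since `tests_wireless.yml` itself is not reproduced in this log:

```yaml
# Hypothetical shape of the gated "Gathering Facts" step.
# On this host the second condition fails, so Ansible emits:
#   skipping: [managed_node1] => {"skip_reason": "Conditional result was False"}
- name: Gathering Facts
  setup:
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```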
34296 1726855346.82266: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.82267: done checking to see if all hosts have failed 34296 1726855346.82268: getting the remaining hosts for this loop 34296 1726855346.82269: done getting the remaining hosts for this loop 34296 1726855346.82272: getting the next task for host managed_node1 34296 1726855346.82286: done getting next task for host managed_node1 34296 1726855346.82289: ^ task is: TASK: meta (flush_handlers) 34296 1726855346.82292: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855346.82296: getting variables 34296 1726855346.82297: in VariableManager get_vars() 34296 1726855346.82332: Calling all_inventory to load vars for managed_node1 34296 1726855346.82335: Calling groups_inventory to load vars for managed_node1 34296 1726855346.82337: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.82345: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.82348: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.82350: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.82464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.82593: done with get_vars() 34296 1726855346.82600: done getting variables 34296 1726855346.82678: in VariableManager get_vars() 34296 1726855346.82697: Calling all_inventory to load vars for managed_node1 34296 1726855346.82699: Calling groups_inventory to load vars for managed_node1 34296 1726855346.82701: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.82705: Calling all_plugins_play to load vars 
for managed_node1 34296 1726855346.82707: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.82710: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.82869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.83058: done with get_vars() 34296 1726855346.83072: done queuing things up, now waiting for results queue to drain 34296 1726855346.83074: results queue empty 34296 1726855346.83075: checking for any_errors_fatal 34296 1726855346.83077: done checking for any_errors_fatal 34296 1726855346.83077: checking for max_fail_percentage 34296 1726855346.83078: done checking for max_fail_percentage 34296 1726855346.83079: checking to see if all hosts have failed and the running result is not ok 34296 1726855346.83080: done checking to see if all hosts have failed 34296 1726855346.83081: getting the remaining hosts for this loop 34296 1726855346.83081: done getting the remaining hosts for this loop 34296 1726855346.83084: getting the next task for host managed_node1 34296 1726855346.83091: done getting next task for host managed_node1 34296 1726855346.83093: ^ task is: TASK: INIT: wireless tests 34296 1726855346.83095: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855346.83096: getting variables 34296 1726855346.83097: in VariableManager get_vars() 34296 1726855346.83112: Calling all_inventory to load vars for managed_node1 34296 1726855346.83114: Calling groups_inventory to load vars for managed_node1 34296 1726855346.83116: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855346.83121: Calling all_plugins_play to load vars for managed_node1 34296 1726855346.83123: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855346.83125: Calling groups_plugins_play to load vars for managed_node1 34296 1726855346.83260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855346.83455: done with get_vars() 34296 1726855346.83462: done getting variables 34296 1726855346.83542: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 14:02:26 -0400 (0:00:00.028) 0:00:02.873 ****** 34296 1726855346.83568: entering _queue_task() for managed_node1/debug 34296 1726855346.83570: Creating lock for debug 34296 1726855346.83919: worker is 1 (out of 1 available) 34296 1726855346.83932: exiting _queue_task() for managed_node1/debug 34296 1726855346.83943: done queuing things up, now waiting for results queue to drain 34296 1726855346.83944: waiting for pending results... 
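The `INIT: wireless tests` task queued above is a `debug` action (note the `Loading ActionModule 'debug'` entry, with `found_in_cache=False` because this is the first debug task of the run, so the module is loaded and a lock created). A hedged sketch of what such a task could look like — only the name and the `when` conditions come from the trace; the message is an invented placeholder:

```yaml
# Hypothetical sketch of the INIT task at tests_wireless.yml:8.
# Skipped debug tasks report only "false_condition" in their result,
# with no "changed" key, which matches the output that follows.
- name: "INIT: wireless tests"
  debug:
    msg: "Starting wireless tests"   # placeholder; real msg not in this log
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```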
34296 1726855346.84404: running TaskExecutor() for managed_node1/TASK: INIT: wireless tests 34296 1726855346.84409: in run() - task 0affcc66-ac2b-a97a-1acc-00000000000b 34296 1726855346.84412: variable 'ansible_search_path' from source: unknown 34296 1726855346.84416: calling self._execute() 34296 1726855346.84435: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855346.84446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855346.84461: variable 'omit' from source: magic vars 34296 1726855346.84831: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.84850: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855346.84974: variable 'ansible_distribution_major_version' from source: facts 34296 1726855346.84985: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855346.84995: when evaluation is False, skipping this task 34296 1726855346.85001: _execute() done 34296 1726855346.85007: dumping result to json 34296 1726855346.85074: done dumping result, returning 34296 1726855346.85078: done running TaskExecutor() for managed_node1/TASK: INIT: wireless tests [0affcc66-ac2b-a97a-1acc-00000000000b] 34296 1726855346.85080: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000b 34296 1726855346.85147: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000b 34296 1726855346.85150: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855346.85226: no more pending results, returning what we have 34296 1726855346.85229: results queue empty 34296 1726855346.85230: checking for any_errors_fatal 34296 1726855346.85233: done checking for any_errors_fatal 34296 1726855346.85234: checking for max_fail_percentage 34296 1726855346.85235: done checking for max_fail_percentage 34296 1726855346.85236: checking to see if all hosts 
have failed and the running result is not ok
34296 1726855346.85237: done checking to see if all hosts have failed
34296 1726855346.85237: getting the remaining hosts for this loop
34296 1726855346.85238: done getting the remaining hosts for this loop
34296 1726855346.85242: getting the next task for host managed_node1
34296 1726855346.85248: done getting next task for host managed_node1
34296 1726855346.85251: ^ task is: TASK: Include the task 'setup_mock_wifi.yml'
34296 1726855346.85254: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.85257: getting variables
34296 1726855346.85259: in VariableManager get_vars()
34296 1726855346.85311: Calling all_inventory to load vars for managed_node1
34296 1726855346.85315: Calling groups_inventory to load vars for managed_node1
34296 1726855346.85318: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.85330: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.85333: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.85336: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.85817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.86001: done with get_vars()
34296 1726855346.86012: done getting variables

TASK [Include the task 'setup_mock_wifi.yml'] **********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11
Friday 20 September 2024 14:02:26 -0400 (0:00:00.025) 0:00:02.898 ******
34296 1726855346.86104: entering _queue_task() for managed_node1/include_tasks
34296 1726855346.86364: worker is 1 (out of 1 available)
34296 1726855346.86378: exiting _queue_task() for managed_node1/include_tasks
34296 1726855346.86390: done queuing things up, now waiting for results queue to drain
34296 1726855346.86391: waiting for pending results...
34296 1726855346.86805: running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml'
34296 1726855346.86809: in run() - task 0affcc66-ac2b-a97a-1acc-00000000000c
34296 1726855346.86812: variable 'ansible_search_path' from source: unknown
34296 1726855346.86814: calling self._execute()
34296 1726855346.86861: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.86873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.86884: variable 'omit' from source: magic vars
34296 1726855346.87343: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.87369: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.87498: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.87508: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.87515: when evaluation is False, skipping this task
34296 1726855346.87521: _execute() done
34296 1726855346.87526: dumping result to json
34296 1726855346.87533: done dumping result, returning
34296 1726855346.87542: done running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' [0affcc66-ac2b-a97a-1acc-00000000000c]
34296 1726855346.87550: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000c
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855346.87737: no more pending results, returning what we have
34296 1726855346.87741: results queue empty
34296 1726855346.87741: checking for any_errors_fatal
34296 1726855346.87751: done checking for any_errors_fatal
34296 1726855346.87752: checking for max_fail_percentage
34296 1726855346.87754: done checking for max_fail_percentage
34296 1726855346.87755: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.87755: done checking to see if all hosts have failed
34296 1726855346.87756: getting the remaining hosts for this loop
34296 1726855346.87758: done getting the remaining hosts for this loop
34296 1726855346.87762: getting the next task for host managed_node1
34296 1726855346.87770: done getting next task for host managed_node1
34296 1726855346.87772: ^ task is: TASK: Copy client certs
34296 1726855346.87775: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.87778: getting variables
34296 1726855346.87780: in VariableManager get_vars()
34296 1726855346.87833: Calling all_inventory to load vars for managed_node1
34296 1726855346.87836: Calling groups_inventory to load vars for managed_node1
34296 1726855346.87839: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.87853: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.87857: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.87860: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.88327: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000c
34296 1726855346.88331: WORKER PROCESS EXITING
34296 1726855346.88356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.88561: done with get_vars()
34296 1726855346.88574: done getting variables
34296 1726855346.88631: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Copy client certs] *******************************************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
Friday 20 September 2024 14:02:26 -0400 (0:00:00.025) 0:00:02.924 ******
34296 1726855346.88658: entering _queue_task() for managed_node1/copy
34296 1726855346.88927: worker is 1 (out of 1 available)
34296 1726855346.88942: exiting _queue_task() for managed_node1/copy
34296 1726855346.88955: done queuing things up, now waiting for results queue to drain
34296 1726855346.88956: waiting for pending results...
34296 1726855346.89217: running TaskExecutor() for managed_node1/TASK: Copy client certs
34296 1726855346.89311: in run() - task 0affcc66-ac2b-a97a-1acc-00000000000d
34296 1726855346.89332: variable 'ansible_search_path' from source: unknown
34296 1726855346.89570: Loaded config def from plugin (lookup/items)
34296 1726855346.89583: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py
34296 1726855346.89642: variable 'omit' from source: magic vars
34296 1726855346.89765: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.89782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.89799: variable 'omit' from source: magic vars
34296 1726855346.90131: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.90146: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.90269: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.90284: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.90294: when evaluation is False, skipping this task
34296 1726855346.90323: variable 'item' from source: unknown
34296 1726855346.90398: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.key) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.key",
    "skip_reason": "Conditional result was False"
}
34296 1726855346.90694: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.90698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.90700: variable 'omit' from source: magic vars
34296 1726855346.90771: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.90782: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.90897: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.90911: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.90917: when evaluation is False, skipping this task
34296 1726855346.90944: variable 'item' from source: unknown
34296 1726855346.91005: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.pem",
    "skip_reason": "Conditional result was False"
}
34296 1726855346.91235: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.91238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.91240: variable 'omit' from source: magic vars
34296 1726855346.91315: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.91325: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.91433: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.91450: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.91454: when evaluation is False, skipping this task
34296 1726855346.91560: variable 'item' from source: unknown
34296 1726855346.91563: variable 'item' from source: unknown
skipping: [managed_node1] => (item=cacert.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "cacert.pem",
    "skip_reason": "Conditional result was False"
}
34296 1726855346.91624: dumping result to json
34296 1726855346.91627: done dumping result, returning
34296 1726855346.91636: done running TaskExecutor() for managed_node1/TASK: Copy client certs [0affcc66-ac2b-a97a-1acc-00000000000d]
34296 1726855346.91670: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000d
skipping: [managed_node1] => {
    "changed": false
}

MSG:

All items skipped
34296 1726855346.91813: no more pending results, returning what we have
34296 1726855346.91816: results queue empty
34296 1726855346.91817: checking for any_errors_fatal
34296 1726855346.91824: done checking for any_errors_fatal
34296 1726855346.91825: checking for max_fail_percentage
34296 1726855346.91826: done checking for max_fail_percentage
34296 1726855346.91827: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.91828: done checking to see if all hosts have failed
34296 1726855346.91828: getting the remaining hosts for this loop
34296 1726855346.91830: done getting the remaining hosts for this loop
34296 1726855346.91834: getting the next task for host managed_node1
34296 1726855346.91841: done getting next task for host managed_node1
34296 1726855346.91843: ^ task is: TASK: TEST: wireless connection with WPA-PSK
34296 1726855346.91846: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.91848: getting variables
34296 1726855346.91850: in VariableManager get_vars()
34296 1726855346.91901: Calling all_inventory to load vars for managed_node1
34296 1726855346.91904: Calling groups_inventory to load vars for managed_node1
34296 1726855346.91906: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.91918: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.91920: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.91923: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.92317: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000d
34296 1726855346.92320: WORKER PROCESS EXITING
34296 1726855346.92343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.92539: done with get_vars()
34296 1726855346.92550: done getting variables
34296 1726855346.92610: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: wireless connection with WPA-PSK] **********************************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24
Friday 20 September 2024 14:02:26 -0400 (0:00:00.039) 0:00:02.964 ******
34296 1726855346.92635: entering _queue_task() for managed_node1/debug
34296 1726855346.93101: worker is 1 (out of 1 available)
34296 1726855346.93108: exiting _queue_task() for managed_node1/debug
34296 1726855346.93118: done queuing things up, now waiting for results queue to drain
34296 1726855346.93119: waiting for pending results...
34296 1726855346.93171: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK
34296 1726855346.93264: in run() - task 0affcc66-ac2b-a97a-1acc-00000000000f
34296 1726855346.93289: variable 'ansible_search_path' from source: unknown
34296 1726855346.93329: calling self._execute()
34296 1726855346.93416: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.93427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.93442: variable 'omit' from source: magic vars
34296 1726855346.93821: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.93842: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.93960: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.93975: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.93984: when evaluation is False, skipping this task
34296 1726855346.93997: _execute() done
34296 1726855346.94005: dumping result to json
34296 1726855346.94013: done dumping result, returning
34296 1726855346.94025: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK [0affcc66-ac2b-a97a-1acc-00000000000f]
34296 1726855346.94034: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000f
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34296 1726855346.94178: no more pending results, returning what we have
34296 1726855346.94182: results queue empty
34296 1726855346.94183: checking for any_errors_fatal
34296 1726855346.94192: done checking for any_errors_fatal
34296 1726855346.94193: checking for max_fail_percentage
34296 1726855346.94195: done checking for max_fail_percentage
34296 1726855346.94196: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.94197: done checking to see if all hosts have failed
34296 1726855346.94197: getting the remaining hosts for this loop
34296 1726855346.94199: done getting the remaining hosts for this loop
34296 1726855346.94202: getting the next task for host managed_node1
34296 1726855346.94210: done getting next task for host managed_node1
34296 1726855346.94215: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34296 1726855346.94219: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.94234: getting variables
34296 1726855346.94236: in VariableManager get_vars()
34296 1726855346.94390: Calling all_inventory to load vars for managed_node1
34296 1726855346.94394: Calling groups_inventory to load vars for managed_node1
34296 1726855346.94397: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.94402: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000000f
34296 1726855346.94405: WORKER PROCESS EXITING
34296 1726855346.94414: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.94417: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.94420: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.94747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.94947: done with get_vars()
34296 1726855346.94957: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 14:02:26 -0400 (0:00:00.026) 0:00:02.990 ******
34296 1726855346.95260: entering _queue_task() for managed_node1/include_tasks
34296 1726855346.95752: worker is 1 (out of 1 available)
34296 1726855346.95768: exiting _queue_task() for managed_node1/include_tasks
34296 1726855346.95779: done queuing things up, now waiting for results queue to drain
34296 1726855346.95780: waiting for pending results...
34296 1726855346.96340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34296 1726855346.96568: in run() - task 0affcc66-ac2b-a97a-1acc-000000000017
34296 1726855346.96578: variable 'ansible_search_path' from source: unknown
34296 1726855346.96582: variable 'ansible_search_path' from source: unknown
34296 1726855346.96711: calling self._execute()
34296 1726855346.96953: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855346.96957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855346.96969: variable 'omit' from source: magic vars
34296 1726855346.97782: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.98010: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855346.98171: variable 'ansible_distribution_major_version' from source: facts
34296 1726855346.98182: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855346.98191: when evaluation is False, skipping this task
34296 1726855346.98199: _execute() done
34296 1726855346.98205: dumping result to json
34296 1726855346.98212: done dumping result, returning
34296 1726855346.98223: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-a97a-1acc-000000000017]
34296 1726855346.98236: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000017
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855346.98447: no more pending results, returning what we have
34296 1726855346.98565: results queue empty
34296 1726855346.98577: checking for any_errors_fatal
34296 1726855346.98585: done checking for any_errors_fatal
34296 1726855346.98585: checking for max_fail_percentage
34296 1726855346.98589: done checking for max_fail_percentage
34296 1726855346.98589: checking to see if all hosts have failed and the running result is not ok
34296 1726855346.98590: done checking to see if all hosts have failed
34296 1726855346.98591: getting the remaining hosts for this loop
34296 1726855346.98593: done getting the remaining hosts for this loop
34296 1726855346.98597: getting the next task for host managed_node1
34296 1726855346.98604: done getting next task for host managed_node1
34296 1726855346.98609: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34296 1726855346.98612: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855346.98628: getting variables
34296 1726855346.98630: in VariableManager get_vars()
34296 1726855346.98811: Calling all_inventory to load vars for managed_node1
34296 1726855346.98814: Calling groups_inventory to load vars for managed_node1
34296 1726855346.98817: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855346.98823: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000017
34296 1726855346.98825: WORKER PROCESS EXITING
34296 1726855346.98836: Calling all_plugins_play to load vars for managed_node1
34296 1726855346.98840: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855346.98843: Calling groups_plugins_play to load vars for managed_node1
34296 1726855346.99237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855346.99461: done with get_vars()
34296 1726855346.99472: done getting variables
34296 1726855346.99531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 14:02:26 -0400 (0:00:00.043) 0:00:03.033 ******
34296 1726855346.99574: entering _queue_task() for managed_node1/debug
34296 1726855346.99990: worker is 1 (out of 1 available)
34296 1726855347.00003: exiting _queue_task() for managed_node1/debug
34296 1726855347.00012: done queuing things up, now waiting for results queue to drain
34296 1726855347.00013: waiting for pending results...
34296 1726855347.00169: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
34296 1726855347.00321: in run() - task 0affcc66-ac2b-a97a-1acc-000000000018
34296 1726855347.00344: variable 'ansible_search_path' from source: unknown
34296 1726855347.00354: variable 'ansible_search_path' from source: unknown
34296 1726855347.00429: calling self._execute()
34296 1726855347.00569: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.00592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.00637: variable 'omit' from source: magic vars
34296 1726855347.01035: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.01055: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.01194: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.01206: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.01219: when evaluation is False, skipping this task
34296 1726855347.01295: _execute() done
34296 1726855347.01298: dumping result to json
34296 1726855347.01301: done dumping result, returning
34296 1726855347.01303: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-a97a-1acc-000000000018]
34296 1726855347.01305: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000018
34296 1726855347.01370: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000018
34296 1726855347.01374: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34296 1726855347.01440: no more pending results, returning what we have
34296 1726855347.01443: results queue empty
34296 1726855347.01444: checking for any_errors_fatal
34296 1726855347.01451: done checking for any_errors_fatal
34296 1726855347.01451: checking for max_fail_percentage
34296 1726855347.01454: done checking for max_fail_percentage
34296 1726855347.01455: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.01456: done checking to see if all hosts have failed
34296 1726855347.01456: getting the remaining hosts for this loop
34296 1726855347.01458: done getting the remaining hosts for this loop
34296 1726855347.01462: getting the next task for host managed_node1
34296 1726855347.01468: done getting next task for host managed_node1
34296 1726855347.01472: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34296 1726855347.01475: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.01493: getting variables
34296 1726855347.01495: in VariableManager get_vars()
34296 1726855347.01544: Calling all_inventory to load vars for managed_node1
34296 1726855347.01547: Calling groups_inventory to load vars for managed_node1
34296 1726855347.01550: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.01562: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.01565: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.01568: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.02040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.02263: done with get_vars()
34296 1726855347.02293: done getting variables
34296 1726855347.02448: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 14:02:27 -0400 (0:00:00.029) 0:00:03.062 ******
34296 1726855347.02496: entering _queue_task() for managed_node1/fail
34296 1726855347.02498: Creating lock for fail
34296 1726855347.03066: worker is 1 (out of 1 available)
34296 1726855347.03079: exiting _queue_task() for managed_node1/fail
34296 1726855347.03093: done queuing things up, now waiting for results queue to drain
34296 1726855347.03094: waiting for pending results...
34296 1726855347.03805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34296 1726855347.03811: in run() - task 0affcc66-ac2b-a97a-1acc-000000000019
34296 1726855347.03814: variable 'ansible_search_path' from source: unknown
34296 1726855347.03817: variable 'ansible_search_path' from source: unknown
34296 1726855347.04094: calling self._execute()
34296 1726855347.04098: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.04102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.04104: variable 'omit' from source: magic vars
34296 1726855347.05315: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.05398: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.05655: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.05671: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.05681: when evaluation is False, skipping this task
34296 1726855347.05693: _execute() done
34296 1726855347.05701: dumping result to json
34296 1726855347.05710: done dumping result, returning
34296 1726855347.05724: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-a97a-1acc-000000000019]
34296 1726855347.05734: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000019
34296 1726855347.06198: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000019
34296 1726855347.06201: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855347.06252: no more pending results, returning what we have
34296 1726855347.06256: results queue empty
34296 1726855347.06257: checking for any_errors_fatal
34296 1726855347.06262: done checking for any_errors_fatal
34296 1726855347.06263: checking for max_fail_percentage
34296 1726855347.06265: done checking for max_fail_percentage
34296 1726855347.06268: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.06269: done checking to see if all hosts have failed
34296 1726855347.06270: getting the remaining hosts for this loop
34296 1726855347.06271: done getting the remaining hosts for this loop
34296 1726855347.06275: getting the next task for host managed_node1
34296 1726855347.06282: done getting next task for host managed_node1
34296 1726855347.06286: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34296 1726855347.06292: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.06307: getting variables
34296 1726855347.06309: in VariableManager get_vars()
34296 1726855347.06361: Calling all_inventory to load vars for managed_node1
34296 1726855347.06364: Calling groups_inventory to load vars for managed_node1
34296 1726855347.06370: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.06384: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.06790: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.06797: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.07396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.07637: done with get_vars()
34296 1726855347.07647: done getting variables
34296 1726855347.07713: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 14:02:27 -0400 (0:00:00.052) 0:00:03.115 ******
34296 1726855347.07746: entering _queue_task() for managed_node1/fail
34296 1726855347.08050: worker is 1 (out of 1 available)
34296 1726855347.08062: exiting _queue_task() for managed_node1/fail
34296 1726855347.08077: done queuing things up, now waiting for results queue to drain
34296 1726855347.08079: waiting for pending results...
34296 1726855347.08339: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34296 1726855347.08473: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001a
34296 1726855347.08495: variable 'ansible_search_path' from source: unknown
34296 1726855347.08505: variable 'ansible_search_path' from source: unknown
34296 1726855347.08546: calling self._execute()
34296 1726855347.08638: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.08651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.08665: variable 'omit' from source: magic vars
34296 1726855347.09045: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.09070: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.09227: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.09238: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.09245: when evaluation is False, skipping this task
34296 1726855347.09251: _execute() done
34296 1726855347.09258: dumping result to json
34296 1726855347.09270: done dumping result, returning
34296 1726855347.09308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-a97a-1acc-00000000001a]
34296 1726855347.09372: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001a
34296 1726855347.09649: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001a
34296 1726855347.09653: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855347.09744: no more pending results, returning what we have
34296 1726855347.09747: results queue empty
34296 1726855347.09748: checking for any_errors_fatal
34296 1726855347.09756: done checking for any_errors_fatal
34296 1726855347.09757: checking for max_fail_percentage
34296 1726855347.09758: done checking for max_fail_percentage
34296 1726855347.09759: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.09760: done checking to see if all hosts have failed
34296 1726855347.09761: getting the remaining hosts for this loop
34296 1726855347.09762: done getting the remaining hosts for this loop
34296 1726855347.09768: getting the next task for host managed_node1
34296 1726855347.09773: done getting next task for host managed_node1
34296 1726855347.09777: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34296 1726855347.09781: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.09796: getting variables
34296 1726855347.09798: in VariableManager get_vars()
34296 1726855347.09844: Calling all_inventory to load vars for managed_node1
34296 1726855347.09847: Calling groups_inventory to load vars for managed_node1
34296 1726855347.09849: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.09861: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.09863: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.09868: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.10156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.10346: done with get_vars()
34296 1726855347.10356: done getting variables
34296 1726855347.10419: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 14:02:27 -0400 (0:00:00.027) 0:00:03.142 ******
34296 1726855347.10453: entering _queue_task() for managed_node1/fail
34296 1726855347.10913: worker is 1 (out of 1 available)
34296 1726855347.10920: exiting _queue_task() for managed_node1/fail
34296 1726855347.10929: done queuing things up, now waiting for results queue to drain
34296 1726855347.10930: waiting for pending results...
34296 1726855347.11057: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855347.11135: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001b 34296 1726855347.11158: variable 'ansible_search_path' from source: unknown 34296 1726855347.11168: variable 'ansible_search_path' from source: unknown 34296 1726855347.11207: calling self._execute() 34296 1726855347.11298: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.11374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.11377: variable 'omit' from source: magic vars 34296 1726855347.11759: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.11778: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.11897: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.11910: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.11917: when evaluation is False, skipping this task 34296 1726855347.11922: _execute() done 34296 1726855347.11928: dumping result to json 34296 1726855347.11934: done dumping result, returning 34296 1726855347.11944: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-a97a-1acc-00000000001b] 34296 1726855347.11952: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001b 34296 1726855347.12234: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001b 34296 1726855347.12237: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.12278: no more pending 
results, returning what we have 34296 1726855347.12281: results queue empty 34296 1726855347.12282: checking for any_errors_fatal 34296 1726855347.12286: done checking for any_errors_fatal 34296 1726855347.12289: checking for max_fail_percentage 34296 1726855347.12290: done checking for max_fail_percentage 34296 1726855347.12291: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.12292: done checking to see if all hosts have failed 34296 1726855347.12292: getting the remaining hosts for this loop 34296 1726855347.12294: done getting the remaining hosts for this loop 34296 1726855347.12297: getting the next task for host managed_node1 34296 1726855347.12302: done getting next task for host managed_node1 34296 1726855347.12305: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855347.12308: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.12322: getting variables 34296 1726855347.12324: in VariableManager get_vars() 34296 1726855347.12363: Calling all_inventory to load vars for managed_node1 34296 1726855347.12368: Calling groups_inventory to load vars for managed_node1 34296 1726855347.12371: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.12380: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.12383: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.12385: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.12669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.12896: done with get_vars() 34296 1726855347.12905: done getting variables 34296 1726855347.13004: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.168 ****** 34296 1726855347.13036: entering _queue_task() for managed_node1/dnf 34296 1726855347.13330: worker is 1 (out of 1 available) 34296 1726855347.13498: exiting _queue_task() for managed_node1/dnf 34296 1726855347.13507: done queuing things up, now waiting for results queue to drain 34296 1726855347.13509: waiting for pending results... 
34296 1726855347.14105: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855347.14299: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001c 34296 1726855347.14303: variable 'ansible_search_path' from source: unknown 34296 1726855347.14306: variable 'ansible_search_path' from source: unknown 34296 1726855347.14441: calling self._execute() 34296 1726855347.14483: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.14558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.14578: variable 'omit' from source: magic vars 34296 1726855347.15185: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.15227: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.15394: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.15408: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.15415: when evaluation is False, skipping this task 34296 1726855347.15421: _execute() done 34296 1726855347.15427: dumping result to json 34296 1726855347.15434: done dumping result, returning 34296 1726855347.15454: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-00000000001c] 34296 1726855347.15465: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.15741: no more pending results, returning what we have 34296 1726855347.15745: results queue empty 34296 
1726855347.15746: checking for any_errors_fatal 34296 1726855347.15752: done checking for any_errors_fatal 34296 1726855347.15753: checking for max_fail_percentage 34296 1726855347.15754: done checking for max_fail_percentage 34296 1726855347.15756: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.15756: done checking to see if all hosts have failed 34296 1726855347.15757: getting the remaining hosts for this loop 34296 1726855347.15758: done getting the remaining hosts for this loop 34296 1726855347.15762: getting the next task for host managed_node1 34296 1726855347.15771: done getting next task for host managed_node1 34296 1726855347.15775: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855347.15778: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.15795: getting variables 34296 1726855347.15797: in VariableManager get_vars() 34296 1726855347.15849: Calling all_inventory to load vars for managed_node1 34296 1726855347.15852: Calling groups_inventory to load vars for managed_node1 34296 1726855347.15855: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.15869: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.15872: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.15875: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.16205: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001c 34296 1726855347.16209: WORKER PROCESS EXITING 34296 1726855347.16230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.16445: done with get_vars() 34296 1726855347.16496: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34296 1726855347.16569: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:27 -0400 (0:00:00.035) 0:00:03.203 ****** 34296 1726855347.16604: entering _queue_task() for managed_node1/yum 34296 1726855347.16606: Creating lock for yum 34296 1726855347.17359: worker is 1 (out of 1 available) 34296 1726855347.17373: exiting _queue_task() for managed_node1/yum 34296 
1726855347.17383: done queuing things up, now waiting for results queue to drain 34296 1726855347.17384: waiting for pending results... 34296 1726855347.17739: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855347.18135: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001d 34296 1726855347.18138: variable 'ansible_search_path' from source: unknown 34296 1726855347.18141: variable 'ansible_search_path' from source: unknown 34296 1726855347.18167: calling self._execute() 34296 1726855347.18351: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.18356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.18358: variable 'omit' from source: magic vars 34296 1726855347.19352: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.19373: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.19750: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.19811: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.19894: when evaluation is False, skipping this task 34296 1726855347.19898: _execute() done 34296 1726855347.19900: dumping result to json 34296 1726855347.19902: done dumping result, returning 34296 1726855347.19911: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-00000000001d] 34296 1726855347.19921: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
34296 1726855347.20327: no more pending results, returning what we have 34296 1726855347.20331: results queue empty 34296 1726855347.20331: checking for any_errors_fatal 34296 1726855347.20337: done checking for any_errors_fatal 34296 1726855347.20338: checking for max_fail_percentage 34296 1726855347.20340: done checking for max_fail_percentage 34296 1726855347.20341: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.20341: done checking to see if all hosts have failed 34296 1726855347.20342: getting the remaining hosts for this loop 34296 1726855347.20343: done getting the remaining hosts for this loop 34296 1726855347.20347: getting the next task for host managed_node1 34296 1726855347.20353: done getting next task for host managed_node1 34296 1726855347.20356: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855347.20359: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.20374: getting variables 34296 1726855347.20376: in VariableManager get_vars() 34296 1726855347.20424: Calling all_inventory to load vars for managed_node1 34296 1726855347.20426: Calling groups_inventory to load vars for managed_node1 34296 1726855347.20428: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.20438: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.20441: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.20444: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.20801: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001d 34296 1726855347.20805: WORKER PROCESS EXITING 34296 1726855347.20830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.21235: done with get_vars() 34296 1726855347.21247: done getting variables 34296 1726855347.21341: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:27 -0400 (0:00:00.047) 0:00:03.251 ****** 34296 1726855347.21378: entering _queue_task() for managed_node1/fail 34296 1726855347.21695: worker is 1 (out of 1 available) 34296 1726855347.21709: exiting _queue_task() for managed_node1/fail 34296 1726855347.21722: done queuing things up, now waiting for results queue to drain 34296 1726855347.21723: waiting for pending results... 
34296 1726855347.21959: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855347.22086: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001e 34296 1726855347.22109: variable 'ansible_search_path' from source: unknown 34296 1726855347.22119: variable 'ansible_search_path' from source: unknown 34296 1726855347.22165: calling self._execute() 34296 1726855347.22261: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.22271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.22286: variable 'omit' from source: magic vars 34296 1726855347.22778: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.22783: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.22820: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.22830: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.22837: when evaluation is False, skipping this task 34296 1726855347.22843: _execute() done 34296 1726855347.22849: dumping result to json 34296 1726855347.22856: done dumping result, returning 34296 1726855347.22867: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-00000000001e] 34296 1726855347.22877: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.23049: no more pending results, returning what we have 34296 1726855347.23053: results queue empty 34296 1726855347.23055: checking for any_errors_fatal 34296 1726855347.23065: done checking for 
any_errors_fatal 34296 1726855347.23066: checking for max_fail_percentage 34296 1726855347.23068: done checking for max_fail_percentage 34296 1726855347.23069: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.23070: done checking to see if all hosts have failed 34296 1726855347.23071: getting the remaining hosts for this loop 34296 1726855347.23072: done getting the remaining hosts for this loop 34296 1726855347.23076: getting the next task for host managed_node1 34296 1726855347.23082: done getting next task for host managed_node1 34296 1726855347.23086: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34296 1726855347.23091: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.23219: getting variables 34296 1726855347.23222: in VariableManager get_vars() 34296 1726855347.23278: Calling all_inventory to load vars for managed_node1 34296 1726855347.23281: Calling groups_inventory to load vars for managed_node1 34296 1726855347.23283: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.23304: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.23307: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.23412: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.23731: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001e 34296 1726855347.23735: WORKER PROCESS EXITING 34296 1726855347.23762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.23989: done with get_vars() 34296 1726855347.24000: done getting variables 34296 1726855347.24057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:27 -0400 (0:00:00.027) 0:00:03.278 ****** 34296 1726855347.24098: entering _queue_task() for managed_node1/package 34296 1726855347.24516: worker is 1 (out of 1 available) 34296 1726855347.24527: exiting _queue_task() for managed_node1/package 34296 1726855347.24538: done queuing things up, now waiting for results queue to drain 34296 1726855347.24539: waiting for pending results... 
34296 1726855347.24715: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34296 1726855347.25161: in run() - task 0affcc66-ac2b-a97a-1acc-00000000001f 34296 1726855347.25165: variable 'ansible_search_path' from source: unknown 34296 1726855347.25168: variable 'ansible_search_path' from source: unknown 34296 1726855347.25170: calling self._execute() 34296 1726855347.25223: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.25278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.25298: variable 'omit' from source: magic vars 34296 1726855347.26190: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.26221: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.26437: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.26448: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.26456: when evaluation is False, skipping this task 34296 1726855347.26464: _execute() done 34296 1726855347.26470: dumping result to json 34296 1726855347.26479: done dumping result, returning 34296 1726855347.26501: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-a97a-1acc-00000000001f] 34296 1726855347.26512: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.26776: no more pending results, returning what we have 34296 1726855347.26779: results queue empty 34296 1726855347.26781: checking for any_errors_fatal 34296 1726855347.26790: done checking for any_errors_fatal 34296 1726855347.26791: checking for max_fail_percentage 34296 1726855347.26793: done checking for 
max_fail_percentage 34296 1726855347.26794: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.26795: done checking to see if all hosts have failed 34296 1726855347.26796: getting the remaining hosts for this loop 34296 1726855347.26797: done getting the remaining hosts for this loop 34296 1726855347.26801: getting the next task for host managed_node1 34296 1726855347.26808: done getting next task for host managed_node1 34296 1726855347.26812: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855347.26839: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.26856: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000001f 34296 1726855347.26866: WORKER PROCESS EXITING 34296 1726855347.26877: getting variables 34296 1726855347.26879: in VariableManager get_vars() 34296 1726855347.27019: Calling all_inventory to load vars for managed_node1 34296 1726855347.27022: Calling groups_inventory to load vars for managed_node1 34296 1726855347.27025: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.27153: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.27157: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.27161: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.27435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.27680: done with get_vars() 34296 1726855347.27699: done getting variables 34296 1726855347.27776: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:27 -0400 (0:00:00.037) 0:00:03.315 ****** 34296 1726855347.27827: entering _queue_task() for managed_node1/package 34296 1726855347.28177: worker is 1 (out of 1 available) 34296 1726855347.28196: exiting _queue_task() for managed_node1/package 34296 1726855347.28215: done queuing things up, now waiting for results queue to drain 34296 1726855347.28347: waiting for pending results... 
34296 1726855347.28684: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855347.29136: in run() - task 0affcc66-ac2b-a97a-1acc-000000000020 34296 1726855347.29141: variable 'ansible_search_path' from source: unknown 34296 1726855347.29143: variable 'ansible_search_path' from source: unknown 34296 1726855347.29146: calling self._execute() 34296 1726855347.29505: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.29516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.29529: variable 'omit' from source: magic vars 34296 1726855347.29995: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.30014: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.30133: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.30144: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.30151: when evaluation is False, skipping this task 34296 1726855347.30157: _execute() done 34296 1726855347.30163: dumping result to json 34296 1726855347.30173: done dumping result, returning 34296 1726855347.30185: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-000000000020] 34296 1726855347.30198: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000020 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.30357: no more pending results, returning what we have 34296 1726855347.30360: results queue empty 34296 1726855347.30361: checking for any_errors_fatal 34296 1726855347.30367: done checking for any_errors_fatal 34296 
1726855347.30367: checking for max_fail_percentage 34296 1726855347.30369: done checking for max_fail_percentage 34296 1726855347.30370: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.30371: done checking to see if all hosts have failed 34296 1726855347.30371: getting the remaining hosts for this loop 34296 1726855347.30374: done getting the remaining hosts for this loop 34296 1726855347.30378: getting the next task for host managed_node1 34296 1726855347.30384: done getting next task for host managed_node1 34296 1726855347.30388: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855347.30391: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.30406: getting variables 34296 1726855347.30408: in VariableManager get_vars() 34296 1726855347.30456: Calling all_inventory to load vars for managed_node1 34296 1726855347.30459: Calling groups_inventory to load vars for managed_node1 34296 1726855347.30461: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.30472: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.30475: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.30477: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.30801: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000020 34296 1726855347.30805: WORKER PROCESS EXITING 34296 1726855347.30828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.31039: done with get_vars() 34296 1726855347.31049: done getting variables 34296 1726855347.31118: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:27 -0400 (0:00:00.033) 0:00:03.349 ****** 34296 1726855347.31153: entering _queue_task() for managed_node1/package 34296 1726855347.31449: worker is 1 (out of 1 available) 34296 1726855347.31462: exiting _queue_task() for managed_node1/package 34296 1726855347.31475: done queuing things up, now waiting for results queue to drain 34296 1726855347.31476: waiting for pending results... 
34296 1726855347.31737: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855347.31868: in run() - task 0affcc66-ac2b-a97a-1acc-000000000021 34296 1726855347.31886: variable 'ansible_search_path' from source: unknown 34296 1726855347.31896: variable 'ansible_search_path' from source: unknown 34296 1726855347.31938: calling self._execute() 34296 1726855347.32023: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.32035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.32047: variable 'omit' from source: magic vars 34296 1726855347.32417: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.32434: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.32556: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.32574: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.32582: when evaluation is False, skipping this task 34296 1726855347.32590: _execute() done 34296 1726855347.32597: dumping result to json 34296 1726855347.32604: done dumping result, returning 34296 1726855347.32616: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-000000000021] 34296 1726855347.32624: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.32775: no more pending results, returning what we have 34296 1726855347.32778: results queue empty 34296 1726855347.32780: checking for any_errors_fatal 34296 1726855347.32786: done checking for any_errors_fatal 34296 1726855347.32789: 
checking for max_fail_percentage 34296 1726855347.32790: done checking for max_fail_percentage 34296 1726855347.32791: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.32792: done checking to see if all hosts have failed 34296 1726855347.32792: getting the remaining hosts for this loop 34296 1726855347.32793: done getting the remaining hosts for this loop 34296 1726855347.32797: getting the next task for host managed_node1 34296 1726855347.32804: done getting next task for host managed_node1 34296 1726855347.32807: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855347.32811: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.32826: getting variables 34296 1726855347.32828: in VariableManager get_vars() 34296 1726855347.32881: Calling all_inventory to load vars for managed_node1 34296 1726855347.32884: Calling groups_inventory to load vars for managed_node1 34296 1726855347.32886: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.33156: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000021 34296 1726855347.33160: WORKER PROCESS EXITING 34296 1726855347.33174: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.33177: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.33180: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.33343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.33529: done with get_vars() 34296 1726855347.33541: done getting variables 34296 1726855347.33650: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.374 ****** 34296 1726855347.33688: entering _queue_task() for managed_node1/service 34296 1726855347.33690: Creating lock for service 34296 1726855347.34029: worker is 1 (out of 1 available) 34296 1726855347.34043: exiting _queue_task() for managed_node1/service 34296 1726855347.34053: done queuing things up, now waiting for results queue to drain 34296 1726855347.34054: waiting for pending results... 
34296 1726855347.34406: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855347.34421: in run() - task 0affcc66-ac2b-a97a-1acc-000000000022 34296 1726855347.34442: variable 'ansible_search_path' from source: unknown 34296 1726855347.34451: variable 'ansible_search_path' from source: unknown 34296 1726855347.34496: calling self._execute() 34296 1726855347.34590: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.34603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.34623: variable 'omit' from source: magic vars 34296 1726855347.35003: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.35021: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.35146: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.35162: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.35173: when evaluation is False, skipping this task 34296 1726855347.35180: _execute() done 34296 1726855347.35188: dumping result to json 34296 1726855347.35197: done dumping result, returning 34296 1726855347.35209: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000022] 34296 1726855347.35218: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000022 34296 1726855347.35340: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000022 34296 1726855347.35344: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.35419: no more pending results, returning what we have 34296 1726855347.35423: results queue empty 
34296 1726855347.35424: checking for any_errors_fatal 34296 1726855347.35431: done checking for any_errors_fatal 34296 1726855347.35432: checking for max_fail_percentage 34296 1726855347.35434: done checking for max_fail_percentage 34296 1726855347.35435: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.35435: done checking to see if all hosts have failed 34296 1726855347.35436: getting the remaining hosts for this loop 34296 1726855347.35437: done getting the remaining hosts for this loop 34296 1726855347.35441: getting the next task for host managed_node1 34296 1726855347.35448: done getting next task for host managed_node1 34296 1726855347.35452: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855347.35457: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.35477: getting variables 34296 1726855347.35479: in VariableManager get_vars() 34296 1726855347.35531: Calling all_inventory to load vars for managed_node1 34296 1726855347.35535: Calling groups_inventory to load vars for managed_node1 34296 1726855347.35538: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.35550: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.35554: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.35557: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.36054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.36259: done with get_vars() 34296 1726855347.36272: done getting variables 34296 1726855347.36330: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:27 -0400 (0:00:00.026) 0:00:03.401 ****** 34296 1726855347.36360: entering _queue_task() for managed_node1/service 34296 1726855347.36814: worker is 1 (out of 1 available) 34296 1726855347.36823: exiting _queue_task() for managed_node1/service 34296 1726855347.36832: done queuing things up, now waiting for results queue to drain 34296 1726855347.36833: waiting for pending results... 
34296 1726855347.37004: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855347.37059: in run() - task 0affcc66-ac2b-a97a-1acc-000000000023 34296 1726855347.37062: variable 'ansible_search_path' from source: unknown 34296 1726855347.37074: variable 'ansible_search_path' from source: unknown 34296 1726855347.37114: calling self._execute() 34296 1726855347.37276: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.37280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.37284: variable 'omit' from source: magic vars 34296 1726855347.37598: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.37618: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.37739: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.37750: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.37758: when evaluation is False, skipping this task 34296 1726855347.37768: _execute() done 34296 1726855347.37777: dumping result to json 34296 1726855347.37786: done dumping result, returning 34296 1726855347.37801: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-a97a-1acc-000000000023] 34296 1726855347.37812: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000023 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855347.38034: no more pending results, returning what we have 34296 1726855347.38038: results queue empty 34296 1726855347.38039: checking for any_errors_fatal 34296 1726855347.38046: done checking for any_errors_fatal 34296 1726855347.38047: checking for max_fail_percentage 34296 1726855347.38049: done 
checking for max_fail_percentage 34296 1726855347.38049: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.38050: done checking to see if all hosts have failed 34296 1726855347.38051: getting the remaining hosts for this loop 34296 1726855347.38052: done getting the remaining hosts for this loop 34296 1726855347.38056: getting the next task for host managed_node1 34296 1726855347.38063: done getting next task for host managed_node1 34296 1726855347.38069: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855347.38073: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.38091: getting variables 34296 1726855347.38093: in VariableManager get_vars() 34296 1726855347.38141: Calling all_inventory to load vars for managed_node1 34296 1726855347.38144: Calling groups_inventory to load vars for managed_node1 34296 1726855347.38147: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.38158: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.38161: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.38165: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.38506: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000023 34296 1726855347.38509: WORKER PROCESS EXITING 34296 1726855347.38533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.38739: done with get_vars() 34296 1726855347.38750: done getting variables 34296 1726855347.38814: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.426 ****** 34296 1726855347.38849: entering _queue_task() for managed_node1/service 34296 1726855347.39307: worker is 1 (out of 1 available) 34296 1726855347.39316: exiting _queue_task() for managed_node1/service 34296 1726855347.39326: done queuing things up, now waiting for results queue to drain 34296 1726855347.39327: waiting for pending results... 
34296 1726855347.39395: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855347.39527: in run() - task 0affcc66-ac2b-a97a-1acc-000000000024 34296 1726855347.39547: variable 'ansible_search_path' from source: unknown 34296 1726855347.39560: variable 'ansible_search_path' from source: unknown 34296 1726855347.39605: calling self._execute() 34296 1726855347.39696: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.39708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.39724: variable 'omit' from source: magic vars 34296 1726855347.40164: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.40184: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.40316: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.40426: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.40429: when evaluation is False, skipping this task 34296 1726855347.40432: _execute() done 34296 1726855347.40435: dumping result to json 34296 1726855347.40437: done dumping result, returning 34296 1726855347.40439: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-a97a-1acc-000000000024] 34296 1726855347.40442: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000024 34296 1726855347.40514: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000024 34296 1726855347.40517: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.40576: no more pending results, returning what we have 34296 1726855347.40580: results queue empty 34296 1726855347.40581: checking for any_errors_fatal 
34296 1726855347.40589: done checking for any_errors_fatal 34296 1726855347.40589: checking for max_fail_percentage 34296 1726855347.40591: done checking for max_fail_percentage 34296 1726855347.40592: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.40593: done checking to see if all hosts have failed 34296 1726855347.40594: getting the remaining hosts for this loop 34296 1726855347.40595: done getting the remaining hosts for this loop 34296 1726855347.40599: getting the next task for host managed_node1 34296 1726855347.40606: done getting next task for host managed_node1 34296 1726855347.40610: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855347.40614: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.40629: getting variables 34296 1726855347.40631: in VariableManager get_vars() 34296 1726855347.40683: Calling all_inventory to load vars for managed_node1 34296 1726855347.40686: Calling groups_inventory to load vars for managed_node1 34296 1726855347.40793: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.40803: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.40806: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.40809: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.41103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.41307: done with get_vars() 34296 1726855347.41317: done getting variables 34296 1726855347.41377: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.451 ****** 34296 1726855347.41408: entering _queue_task() for managed_node1/service 34296 1726855347.41658: worker is 1 (out of 1 available) 34296 1726855347.41674: exiting _queue_task() for managed_node1/service 34296 1726855347.41685: done queuing things up, now waiting for results queue to drain 34296 1726855347.41791: waiting for pending results... 
34296 1726855347.41944: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855347.42294: in run() - task 0affcc66-ac2b-a97a-1acc-000000000025 34296 1726855347.42298: variable 'ansible_search_path' from source: unknown 34296 1726855347.42301: variable 'ansible_search_path' from source: unknown 34296 1726855347.42304: calling self._execute() 34296 1726855347.42306: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.42309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.42312: variable 'omit' from source: magic vars 34296 1726855347.42602: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.42620: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.42738: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.42754: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.42762: when evaluation is False, skipping this task 34296 1726855347.42773: _execute() done 34296 1726855347.42781: dumping result to json 34296 1726855347.42791: done dumping result, returning 34296 1726855347.42803: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-a97a-1acc-000000000025] 34296 1726855347.42812: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000025 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855347.43033: no more pending results, returning what we have 34296 1726855347.43037: results queue empty 34296 1726855347.43038: checking for any_errors_fatal 34296 1726855347.43046: done checking for any_errors_fatal 34296 1726855347.43047: checking for max_fail_percentage 34296 1726855347.43048: done checking for 
max_fail_percentage 34296 1726855347.43049: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.43050: done checking to see if all hosts have failed 34296 1726855347.43051: getting the remaining hosts for this loop 34296 1726855347.43052: done getting the remaining hosts for this loop 34296 1726855347.43057: getting the next task for host managed_node1 34296 1726855347.43063: done getting next task for host managed_node1 34296 1726855347.43071: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855347.43074: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.43093: getting variables 34296 1726855347.43095: in VariableManager get_vars() 34296 1726855347.43143: Calling all_inventory to load vars for managed_node1 34296 1726855347.43147: Calling groups_inventory to load vars for managed_node1 34296 1726855347.43149: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.43161: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.43164: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.43170: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.43501: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000025 34296 1726855347.43505: WORKER PROCESS EXITING 34296 1726855347.43529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.43740: done with get_vars() 34296 1726855347.43750: done getting variables 34296 1726855347.43813: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.476 ****** 34296 1726855347.43847: entering _queue_task() for managed_node1/copy 34296 1726855347.44114: worker is 1 (out of 1 available) 34296 1726855347.44128: exiting _queue_task() for managed_node1/copy 34296 1726855347.44139: done queuing things up, now waiting for results queue to drain 34296 1726855347.44140: waiting for pending results... 
34296 1726855347.44398: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855347.44530: in run() - task 0affcc66-ac2b-a97a-1acc-000000000026 34296 1726855347.44550: variable 'ansible_search_path' from source: unknown 34296 1726855347.44559: variable 'ansible_search_path' from source: unknown 34296 1726855347.44603: calling self._execute() 34296 1726855347.44696: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.44709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.44724: variable 'omit' from source: magic vars 34296 1726855347.45178: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.45198: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.45318: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.45329: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.45385: when evaluation is False, skipping this task 34296 1726855347.45390: _execute() done 34296 1726855347.45392: dumping result to json 34296 1726855347.45395: done dumping result, returning 34296 1726855347.45398: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-a97a-1acc-000000000026] 34296 1726855347.45401: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.45723: no more pending results, returning what we have 34296 1726855347.45726: results queue empty 34296 1726855347.45727: checking for any_errors_fatal 34296 1726855347.45732: done checking for any_errors_fatal 34296 1726855347.45733: checking for 
max_fail_percentage 34296 1726855347.45735: done checking for max_fail_percentage 34296 1726855347.45735: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.45736: done checking to see if all hosts have failed 34296 1726855347.45737: getting the remaining hosts for this loop 34296 1726855347.45738: done getting the remaining hosts for this loop 34296 1726855347.45741: getting the next task for host managed_node1 34296 1726855347.45746: done getting next task for host managed_node1 34296 1726855347.45750: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855347.45753: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.45769: getting variables 34296 1726855347.45771: in VariableManager get_vars() 34296 1726855347.45814: Calling all_inventory to load vars for managed_node1 34296 1726855347.45817: Calling groups_inventory to load vars for managed_node1 34296 1726855347.45819: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.45828: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.45831: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.45834: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.46078: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000026 34296 1726855347.46082: WORKER PROCESS EXITING 34296 1726855347.46110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.46316: done with get_vars() 34296 1726855347.46326: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.501 ****** 34296 1726855347.46411: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855347.46413: Creating lock for fedora.linux_system_roles.network_connections 34296 1726855347.46680: worker is 1 (out of 1 available) 34296 1726855347.46895: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855347.46904: done queuing things up, now waiting for results queue to drain 34296 1726855347.46905: waiting for pending results... 
34296 1726855347.46958: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855347.47090: in run() - task 0affcc66-ac2b-a97a-1acc-000000000027 34296 1726855347.47112: variable 'ansible_search_path' from source: unknown 34296 1726855347.47121: variable 'ansible_search_path' from source: unknown 34296 1726855347.47163: calling self._execute() 34296 1726855347.47255: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.47269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.47286: variable 'omit' from source: magic vars 34296 1726855347.47662: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.47689: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.47809: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.47820: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.47827: when evaluation is False, skipping this task 34296 1726855347.47835: _execute() done 34296 1726855347.47841: dumping result to json 34296 1726855347.47849: done dumping result, returning 34296 1726855347.47862: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-a97a-1acc-000000000027] 34296 1726855347.47875: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000027 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.48048: no more pending results, returning what we have 34296 1726855347.48052: results queue empty 34296 1726855347.48053: checking for any_errors_fatal 34296 1726855347.48059: done checking for any_errors_fatal 34296 1726855347.48060: checking for max_fail_percentage 34296 
1726855347.48062: done checking for max_fail_percentage 34296 1726855347.48063: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.48064: done checking to see if all hosts have failed 34296 1726855347.48064: getting the remaining hosts for this loop 34296 1726855347.48069: done getting the remaining hosts for this loop 34296 1726855347.48073: getting the next task for host managed_node1 34296 1726855347.48079: done getting next task for host managed_node1 34296 1726855347.48084: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855347.48090: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.48105: getting variables 34296 1726855347.48107: in VariableManager get_vars() 34296 1726855347.48155: Calling all_inventory to load vars for managed_node1 34296 1726855347.48158: Calling groups_inventory to load vars for managed_node1 34296 1726855347.48160: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.48175: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.48178: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.48182: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.48628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.49291: done with get_vars() 34296 1726855347.49304: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000027 34296 1726855347.49308: WORKER PROCESS EXITING 34296 1726855347.49315: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:27 -0400 (0:00:00.029) 0:00:03.531 ****** 34296 1726855347.49407: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855347.49409: Creating lock for fedora.linux_system_roles.network_state 34296 1726855347.49817: worker is 1 (out of 1 available) 34296 1726855347.49830: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855347.49841: done queuing things up, now waiting for results queue to drain 34296 1726855347.49842: waiting for pending results... 
34296 1726855347.50020: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855347.50159: in run() - task 0affcc66-ac2b-a97a-1acc-000000000028 34296 1726855347.50186: variable 'ansible_search_path' from source: unknown 34296 1726855347.50201: variable 'ansible_search_path' from source: unknown 34296 1726855347.50242: calling self._execute() 34296 1726855347.50331: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.50343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.50472: variable 'omit' from source: magic vars 34296 1726855347.50752: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.50772: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.50917: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.50929: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.50939: when evaluation is False, skipping this task 34296 1726855347.50947: _execute() done 34296 1726855347.50956: dumping result to json 34296 1726855347.50965: done dumping result, returning 34296 1726855347.50978: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-a97a-1acc-000000000028] 34296 1726855347.50992: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000028 34296 1726855347.51256: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000028 34296 1726855347.51261: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.51317: no more pending results, returning what we have 34296 1726855347.51321: results queue empty 34296 1726855347.51322: checking for any_errors_fatal 34296 
1726855347.51330: done checking for any_errors_fatal 34296 1726855347.51331: checking for max_fail_percentage 34296 1726855347.51332: done checking for max_fail_percentage 34296 1726855347.51333: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.51334: done checking to see if all hosts have failed 34296 1726855347.51334: getting the remaining hosts for this loop 34296 1726855347.51336: done getting the remaining hosts for this loop 34296 1726855347.51340: getting the next task for host managed_node1 34296 1726855347.51346: done getting next task for host managed_node1 34296 1726855347.51350: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855347.51353: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.51372: getting variables 34296 1726855347.51374: in VariableManager get_vars() 34296 1726855347.51422: Calling all_inventory to load vars for managed_node1 34296 1726855347.51425: Calling groups_inventory to load vars for managed_node1 34296 1726855347.51427: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.51436: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.51439: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.51441: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.51810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.52220: done with get_vars() 34296 1726855347.52230: done getting variables 34296 1726855347.52349: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:27 -0400 (0:00:00.029) 0:00:03.561 ****** 34296 1726855347.52384: entering _queue_task() for managed_node1/debug 34296 1726855347.52920: worker is 1 (out of 1 available) 34296 1726855347.52934: exiting _queue_task() for managed_node1/debug 34296 1726855347.52946: done queuing things up, now waiting for results queue to drain 34296 1726855347.52947: waiting for pending results... 
34296 1726855347.53315: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855347.53357: in run() - task 0affcc66-ac2b-a97a-1acc-000000000029 34296 1726855347.53383: variable 'ansible_search_path' from source: unknown 34296 1726855347.53396: variable 'ansible_search_path' from source: unknown 34296 1726855347.53442: calling self._execute() 34296 1726855347.53539: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.53552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.53572: variable 'omit' from source: magic vars 34296 1726855347.53969: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.54028: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.54097: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.54103: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.54108: when evaluation is False, skipping this task 34296 1726855347.54114: _execute() done 34296 1726855347.54117: dumping result to json 34296 1726855347.54119: done dumping result, returning 34296 1726855347.54122: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-a97a-1acc-000000000029] 34296 1726855347.54127: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000029 34296 1726855347.54213: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000029 34296 1726855347.54216: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855347.54269: no more pending results, returning what we have 34296 1726855347.54272: results queue empty 34296 1726855347.54273: checking for any_errors_fatal 34296 1726855347.54280: done 
checking for any_errors_fatal 34296 1726855347.54281: checking for max_fail_percentage 34296 1726855347.54283: done checking for max_fail_percentage 34296 1726855347.54284: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.54284: done checking to see if all hosts have failed 34296 1726855347.54285: getting the remaining hosts for this loop 34296 1726855347.54289: done getting the remaining hosts for this loop 34296 1726855347.54293: getting the next task for host managed_node1 34296 1726855347.54298: done getting next task for host managed_node1 34296 1726855347.54302: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855347.54305: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.54317: getting variables 34296 1726855347.54318: in VariableManager get_vars() 34296 1726855347.54364: Calling all_inventory to load vars for managed_node1 34296 1726855347.54367: Calling groups_inventory to load vars for managed_node1 34296 1726855347.54369: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.54379: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.54382: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.54384: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.54536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.54684: done with get_vars() 34296 1726855347.54704: done getting variables 34296 1726855347.54759: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.585 ****** 34296 1726855347.54794: entering _queue_task() for managed_node1/debug 34296 1726855347.55035: worker is 1 (out of 1 available) 34296 1726855347.55047: exiting _queue_task() for managed_node1/debug 34296 1726855347.55059: done queuing things up, now waiting for results queue to drain 34296 1726855347.55060: waiting for pending results... 
34296 1726855347.55402: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855347.55431: in run() - task 0affcc66-ac2b-a97a-1acc-00000000002a 34296 1726855347.55466: variable 'ansible_search_path' from source: unknown 34296 1726855347.55479: variable 'ansible_search_path' from source: unknown 34296 1726855347.55522: calling self._execute() 34296 1726855347.55686: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.55692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.55695: variable 'omit' from source: magic vars 34296 1726855347.56052: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.56071: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.56183: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.56188: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.56192: when evaluation is False, skipping this task 34296 1726855347.56195: _execute() done 34296 1726855347.56198: dumping result to json 34296 1726855347.56202: done dumping result, returning 34296 1726855347.56209: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-a97a-1acc-00000000002a] 34296 1726855347.56215: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002a skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855347.56355: no more pending results, returning what we have 34296 1726855347.56358: results queue empty 34296 1726855347.56359: checking for any_errors_fatal 34296 1726855347.56365: done checking for any_errors_fatal 34296 1726855347.56365: checking for max_fail_percentage 34296 1726855347.56367: done checking for 
max_fail_percentage 34296 1726855347.56367: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.56368: done checking to see if all hosts have failed 34296 1726855347.56369: getting the remaining hosts for this loop 34296 1726855347.56370: done getting the remaining hosts for this loop 34296 1726855347.56374: getting the next task for host managed_node1 34296 1726855347.56381: done getting next task for host managed_node1 34296 1726855347.56385: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855347.56390: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.56404: getting variables 34296 1726855347.56405: in VariableManager get_vars() 34296 1726855347.56441: Calling all_inventory to load vars for managed_node1 34296 1726855347.56444: Calling groups_inventory to load vars for managed_node1 34296 1726855347.56446: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.56454: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.56456: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.56458: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.56577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.56699: done with get_vars() 34296 1726855347.56707: done getting variables 34296 1726855347.56752: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002a 34296 1726855347.56755: WORKER PROCESS EXITING 34296 1726855347.56765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:27 -0400 (0:00:00.019) 0:00:03.605 ****** 34296 1726855347.56786: entering _queue_task() for managed_node1/debug 34296 1726855347.56975: worker is 1 (out of 1 available) 34296 1726855347.56992: exiting _queue_task() for managed_node1/debug 34296 1726855347.57002: done queuing things up, now waiting for results queue to drain 34296 1726855347.57003: waiting for pending results... 
34296 1726855347.57169: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855347.57244: in run() - task 0affcc66-ac2b-a97a-1acc-00000000002b 34296 1726855347.57256: variable 'ansible_search_path' from source: unknown 34296 1726855347.57264: variable 'ansible_search_path' from source: unknown 34296 1726855347.57296: calling self._execute() 34296 1726855347.57354: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.57358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.57375: variable 'omit' from source: magic vars 34296 1726855347.57630: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.57640: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.57741: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.57745: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.57748: when evaluation is False, skipping this task 34296 1726855347.57751: _execute() done 34296 1726855347.57753: dumping result to json 34296 1726855347.57755: done dumping result, returning 34296 1726855347.57762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-a97a-1acc-00000000002b] 34296 1726855347.57811: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002b 34296 1726855347.57874: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002b 34296 1726855347.57878: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855347.57955: no more pending results, returning what we have 34296 1726855347.57958: results queue empty 34296 1726855347.57959: checking for any_errors_fatal 34296 1726855347.57963: done checking for 
any_errors_fatal 34296 1726855347.57963: checking for max_fail_percentage 34296 1726855347.57965: done checking for max_fail_percentage 34296 1726855347.57965: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.57966: done checking to see if all hosts have failed 34296 1726855347.57967: getting the remaining hosts for this loop 34296 1726855347.57968: done getting the remaining hosts for this loop 34296 1726855347.57970: getting the next task for host managed_node1 34296 1726855347.57975: done getting next task for host managed_node1 34296 1726855347.57979: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855347.57981: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.57996: getting variables 34296 1726855347.57997: in VariableManager get_vars() 34296 1726855347.58035: Calling all_inventory to load vars for managed_node1 34296 1726855347.58037: Calling groups_inventory to load vars for managed_node1 34296 1726855347.58039: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.58045: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.58046: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.58048: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.58229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.58422: done with get_vars() 34296 1726855347.58431: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:27 -0400 (0:00:00.017) 0:00:03.622 ****** 34296 1726855347.58519: entering _queue_task() for managed_node1/ping 34296 1726855347.58521: Creating lock for ping 34296 1726855347.58760: worker is 1 (out of 1 available) 34296 1726855347.58771: exiting _queue_task() for managed_node1/ping 34296 1726855347.58781: done queuing things up, now waiting for results queue to drain 34296 1726855347.58782: waiting for pending results... 
34296 1726855347.59024: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855347.59118: in run() - task 0affcc66-ac2b-a97a-1acc-00000000002c 34296 1726855347.59128: variable 'ansible_search_path' from source: unknown 34296 1726855347.59131: variable 'ansible_search_path' from source: unknown 34296 1726855347.59166: calling self._execute() 34296 1726855347.59235: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.59264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.59280: variable 'omit' from source: magic vars 34296 1726855347.59792: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.59796: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.59798: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.59801: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.59803: when evaluation is False, skipping this task 34296 1726855347.59806: _execute() done 34296 1726855347.59808: dumping result to json 34296 1726855347.59810: done dumping result, returning 34296 1726855347.59812: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-a97a-1acc-00000000002c] 34296 1726855347.59814: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002c 34296 1726855347.59876: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000002c 34296 1726855347.59879: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.59932: no more pending results, returning what we have 34296 1726855347.59935: results queue empty 34296 1726855347.59936: checking for any_errors_fatal 34296 
1726855347.59942: done checking for any_errors_fatal 34296 1726855347.59942: checking for max_fail_percentage 34296 1726855347.59944: done checking for max_fail_percentage 34296 1726855347.59945: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.59946: done checking to see if all hosts have failed 34296 1726855347.59946: getting the remaining hosts for this loop 34296 1726855347.59948: done getting the remaining hosts for this loop 34296 1726855347.59951: getting the next task for host managed_node1 34296 1726855347.59960: done getting next task for host managed_node1 34296 1726855347.59963: ^ task is: TASK: meta (role_complete) 34296 1726855347.59966: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.59982: getting variables 34296 1726855347.59984: in VariableManager get_vars() 34296 1726855347.60035: Calling all_inventory to load vars for managed_node1 34296 1726855347.60038: Calling groups_inventory to load vars for managed_node1 34296 1726855347.60041: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.60051: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.60054: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.60057: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.60431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.60595: done with get_vars() 34296 1726855347.60603: done getting variables 34296 1726855347.60657: done queuing things up, now waiting for results queue to drain 34296 1726855347.60658: results queue empty 34296 1726855347.60663: checking for any_errors_fatal 34296 1726855347.60666: done checking for any_errors_fatal 34296 1726855347.60667: checking for max_fail_percentage 34296 1726855347.60668: done checking for max_fail_percentage 34296 1726855347.60668: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.60669: done checking to see if all hosts have failed 34296 1726855347.60670: getting the remaining hosts for this loop 34296 1726855347.60670: done getting the remaining hosts for this loop 34296 1726855347.60673: getting the next task for host managed_node1 34296 1726855347.60677: done getting next task for host managed_node1 34296 1726855347.60680: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855347.60681: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855347.60691: getting variables 34296 1726855347.60692: in VariableManager get_vars() 34296 1726855347.60705: Calling all_inventory to load vars for managed_node1 34296 1726855347.60707: Calling groups_inventory to load vars for managed_node1 34296 1726855347.60708: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.60711: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.60712: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.60714: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.60797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.60934: done with get_vars() 34296 1726855347.60940: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.647 ****** 34296 1726855347.60992: entering _queue_task() for managed_node1/include_tasks 34296 1726855347.61216: worker is 1 (out of 1 available) 34296 1726855347.61229: exiting _queue_task() for managed_node1/include_tasks 34296 1726855347.61239: done queuing things up, now waiting for results queue to drain 34296 1726855347.61240: waiting for pending results... 
34296 1726855347.61412: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855347.61495: in run() - task 0affcc66-ac2b-a97a-1acc-000000000063 34296 1726855347.61506: variable 'ansible_search_path' from source: unknown 34296 1726855347.61510: variable 'ansible_search_path' from source: unknown 34296 1726855347.61538: calling self._execute() 34296 1726855347.61604: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.61608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.61616: variable 'omit' from source: magic vars 34296 1726855347.61883: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.61895: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.61977: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.61981: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.61984: when evaluation is False, skipping this task 34296 1726855347.61989: _execute() done 34296 1726855347.61992: dumping result to json 34296 1726855347.61995: done dumping result, returning 34296 1726855347.62002: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-a97a-1acc-000000000063] 34296 1726855347.62007: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000063 34296 1726855347.62092: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000063 34296 1726855347.62095: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.62162: no more pending results, returning what we have 34296 1726855347.62165: results queue empty 34296 1726855347.62166: checking for 
any_errors_fatal 34296 1726855347.62167: done checking for any_errors_fatal 34296 1726855347.62168: checking for max_fail_percentage 34296 1726855347.62169: done checking for max_fail_percentage 34296 1726855347.62169: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.62170: done checking to see if all hosts have failed 34296 1726855347.62171: getting the remaining hosts for this loop 34296 1726855347.62172: done getting the remaining hosts for this loop 34296 1726855347.62176: getting the next task for host managed_node1 34296 1726855347.62181: done getting next task for host managed_node1 34296 1726855347.62185: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855347.62189: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.62203: getting variables 34296 1726855347.62205: in VariableManager get_vars() 34296 1726855347.62243: Calling all_inventory to load vars for managed_node1 34296 1726855347.62245: Calling groups_inventory to load vars for managed_node1 34296 1726855347.62257: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.62265: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.62268: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.62271: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.62447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.62655: done with get_vars() 34296 1726855347.62665: done getting variables 34296 1726855347.62733: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:27 -0400 (0:00:00.017) 0:00:03.665 ****** 34296 1726855347.62763: entering _queue_task() for managed_node1/debug 34296 1726855347.63099: worker is 1 (out of 1 available) 34296 1726855347.63112: exiting _queue_task() for managed_node1/debug 34296 1726855347.63123: done queuing things up, now waiting for results queue to drain 34296 1726855347.63124: waiting for pending results... 
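Every skipped task above follows the same shape: the `when` conditionals are evaluated in order, and the first one that comes out False short-circuits execution into a "skipping" result that records the failing expression as `false_condition`. A minimal sketch of that control flow (not ansible-core code; the fact value "10" is an assumed example, since the log only shows that the host is neither 6 nor 7):

```python
def evaluate_task(conditions, variables):
    # Check each (source-text, predicate) pair in order; the first False
    # predicate produces the skip payload seen in the log output.
    for cond_src, cond in conditions:
        if not cond(variables):
            return {
                "changed": False,
                "false_condition": cond_src,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

facts = {"ansible_distribution_major_version": "10"}  # assumed example value
result = evaluate_task(
    [
        ("ansible_distribution_major_version != '6'",
         lambda v: v["ansible_distribution_major_version"] != "6"),
        ("ansible_distribution_major_version == '7'",
         lambda v: v["ansible_distribution_major_version"] == "7"),
    ],
    facts,
)
```

Here the first conditional evaluates True and the second False, matching the pair of "Evaluated conditional" lines that precede each skip in this log.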
34296 1726855347.63355: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855347.63464: in run() - task 0affcc66-ac2b-a97a-1acc-000000000064 34296 1726855347.63477: variable 'ansible_search_path' from source: unknown 34296 1726855347.63484: variable 'ansible_search_path' from source: unknown 34296 1726855347.63606: calling self._execute() 34296 1726855347.63771: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.63776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.63778: variable 'omit' from source: magic vars 34296 1726855347.64026: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.64037: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.64119: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.64122: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.64125: when evaluation is False, skipping this task 34296 1726855347.64128: _execute() done 34296 1726855347.64130: dumping result to json 34296 1726855347.64135: done dumping result, returning 34296 1726855347.64142: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-a97a-1acc-000000000064] 34296 1726855347.64148: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000064 34296 1726855347.64230: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000064 34296 1726855347.64233: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855347.64302: no more pending results, returning what we have 34296 1726855347.64305: results queue empty 34296 1726855347.64306: checking for any_errors_fatal 34296 1726855347.64313: done checking for any_errors_fatal 34296 1726855347.64313: 
checking for max_fail_percentage 34296 1726855347.64315: done checking for max_fail_percentage 34296 1726855347.64315: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.64316: done checking to see if all hosts have failed 34296 1726855347.64317: getting the remaining hosts for this loop 34296 1726855347.64319: done getting the remaining hosts for this loop 34296 1726855347.64322: getting the next task for host managed_node1 34296 1726855347.64327: done getting next task for host managed_node1 34296 1726855347.64331: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855347.64333: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.64350: getting variables 34296 1726855347.64352: in VariableManager get_vars() 34296 1726855347.64390: Calling all_inventory to load vars for managed_node1 34296 1726855347.64393: Calling groups_inventory to load vars for managed_node1 34296 1726855347.64395: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.64403: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.64406: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.64408: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.64718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.64837: done with get_vars() 34296 1726855347.64844: done getting variables 34296 1726855347.64891: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:27 -0400 (0:00:00.021) 0:00:03.686 ****** 34296 1726855347.64913: entering _queue_task() for managed_node1/fail 34296 1726855347.65121: worker is 1 (out of 1 available) 34296 1726855347.65135: exiting _queue_task() for managed_node1/fail 34296 1726855347.65145: done queuing things up, now waiting for results queue to drain 34296 1726855347.65146: waiting for pending results... 
34296 1726855347.65327: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855347.65410: in run() - task 0affcc66-ac2b-a97a-1acc-000000000065 34296 1726855347.65423: variable 'ansible_search_path' from source: unknown 34296 1726855347.65426: variable 'ansible_search_path' from source: unknown 34296 1726855347.65454: calling self._execute() 34296 1726855347.65593: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.65596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.65599: variable 'omit' from source: magic vars 34296 1726855347.65931: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.65949: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.66068: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.66079: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.66086: when evaluation is False, skipping this task 34296 1726855347.66094: _execute() done 34296 1726855347.66100: dumping result to json 34296 1726855347.66107: done dumping result, returning 34296 1726855347.66117: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-a97a-1acc-000000000065] 34296 1726855347.66125: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000065 34296 1726855347.66455: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000065 34296 1726855347.66458: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
34296 1726855347.66522: no more pending results, returning what we have 34296 1726855347.66526: results queue empty 34296 1726855347.66526: checking for any_errors_fatal 34296 1726855347.66532: done checking for any_errors_fatal 34296 1726855347.66533: checking for max_fail_percentage 34296 1726855347.66534: done checking for max_fail_percentage 34296 1726855347.66535: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.66536: done checking to see if all hosts have failed 34296 1726855347.66537: getting the remaining hosts for this loop 34296 1726855347.66538: done getting the remaining hosts for this loop 34296 1726855347.66541: getting the next task for host managed_node1 34296 1726855347.66546: done getting next task for host managed_node1 34296 1726855347.66550: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855347.66553: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.66568: getting variables 34296 1726855347.66569: in VariableManager get_vars() 34296 1726855347.66624: Calling all_inventory to load vars for managed_node1 34296 1726855347.66627: Calling groups_inventory to load vars for managed_node1 34296 1726855347.66629: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.66638: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.66640: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.66643: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.66835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.67067: done with get_vars() 34296 1726855347.67078: done getting variables 34296 1726855347.67145: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:27 -0400 (0:00:00.022) 0:00:03.709 ****** 34296 1726855347.67178: entering _queue_task() for managed_node1/fail 34296 1726855347.67497: worker is 1 (out of 1 available) 34296 1726855347.67511: exiting _queue_task() for managed_node1/fail 34296 1726855347.67523: done queuing things up, now waiting for results queue to drain 34296 1726855347.67524: waiting for pending results... 
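Each `get_vars()` block above calls the same six sources in a fixed order. Conceptually, later sources override earlier ones; a toy sketch of that merge, using the source names from the log (the merge here is a plain dict update, whereas real ansible-core precedence has many more layers):

```python
# Source call order as printed by VariableManager.get_vars() in this log.
SOURCES = [
    "all_inventory",
    "groups_inventory",
    "all_plugins_inventory",
    "all_plugins_play",
    "groups_plugins_inventory",
    "groups_plugins_play",
]

def get_vars(loaded: dict) -> dict:
    # Merge source by source; a later source's value wins on key collision.
    merged: dict = {}
    for source in SOURCES:
        merged.update(loaded.get(source, {}))
    return merged

vars_ = get_vars({
    "all_inventory": {"provider": "nm", "region": "east"},
    "groups_plugins_play": {"provider": "initscripts"},
})
```

Keys set only by an early source survive, while `provider` is overridden by the later `groups_plugins_play` source.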
34296 1726855347.67771: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855347.67910: in run() - task 0affcc66-ac2b-a97a-1acc-000000000066 34296 1726855347.67929: variable 'ansible_search_path' from source: unknown 34296 1726855347.67936: variable 'ansible_search_path' from source: unknown 34296 1726855347.67978: calling self._execute() 34296 1726855347.68064: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.68079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.68095: variable 'omit' from source: magic vars 34296 1726855347.68468: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.68490: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.68610: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.68620: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.68627: when evaluation is False, skipping this task 34296 1726855347.68633: _execute() done 34296 1726855347.68638: dumping result to json 34296 1726855347.68645: done dumping result, returning 34296 1726855347.68656: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-a97a-1acc-000000000066] 34296 1726855347.68669: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000066 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.68844: no more pending results, returning what we have 34296 1726855347.68848: results queue empty 34296 1726855347.68849: checking for any_errors_fatal 34296 
1726855347.68855: done checking for any_errors_fatal 34296 1726855347.68856: checking for max_fail_percentage 34296 1726855347.68857: done checking for max_fail_percentage 34296 1726855347.68859: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.68859: done checking to see if all hosts have failed 34296 1726855347.68860: getting the remaining hosts for this loop 34296 1726855347.68861: done getting the remaining hosts for this loop 34296 1726855347.68868: getting the next task for host managed_node1 34296 1726855347.68874: done getting next task for host managed_node1 34296 1726855347.68879: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855347.68882: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.68978: getting variables 34296 1726855347.68980: in VariableManager get_vars() 34296 1726855347.69075: Calling all_inventory to load vars for managed_node1 34296 1726855347.69078: Calling groups_inventory to load vars for managed_node1 34296 1726855347.69081: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.69086: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000066 34296 1726855347.69091: WORKER PROCESS EXITING 34296 1726855347.69099: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.69101: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.69104: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.69291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.69416: done with get_vars() 34296 1726855347.69424: done getting variables 34296 1726855347.69471: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:27 -0400 (0:00:00.023) 0:00:03.732 ****** 34296 1726855347.69497: entering _queue_task() for managed_node1/fail 34296 1726855347.69702: worker is 1 (out of 1 available) 34296 1726855347.69716: exiting _queue_task() for managed_node1/fail 34296 1726855347.69725: done queuing things up, now waiting for results queue to drain 34296 1726855347.69726: waiting for pending results... 
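The "HOST STATE:" dumps above nest one state inside another: the outer state stays at `block=3, task=4` while the parenthesized `tasks child state?` advances `task=2, 3, 4, ...` through the included block. An illustrative dataclass mirroring that structure (field names follow the log text, not ansible-core's internals):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostState:
    # Counters and flags as they appear in the "HOST STATE:" dump.
    block: int = 0
    task: int = 0
    run_state: int = 1
    fail_state: int = 0
    update_handlers: bool = True
    pending_setup: bool = False
    tasks_child_state: Optional["HostState"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# Outer state parked on the include; child state walks the included tasks.
outer = HostState(block=3, task=4,
                  tasks_child_state=HostState(block=0, task=5))
# Advancing to the next task in the included block moves only the child:
outer.tasks_child_state.task += 1
```

This matches the log, where successive "getting the next task" records increment only the nested child state's `task` counter.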
34296 1726855347.69898: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855347.69978: in run() - task 0affcc66-ac2b-a97a-1acc-000000000067 34296 1726855347.69990: variable 'ansible_search_path' from source: unknown 34296 1726855347.69994: variable 'ansible_search_path' from source: unknown 34296 1726855347.70021: calling self._execute() 34296 1726855347.70086: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.70091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.70100: variable 'omit' from source: magic vars 34296 1726855347.70368: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.70377: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.70456: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.70459: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.70462: when evaluation is False, skipping this task 34296 1726855347.70467: _execute() done 34296 1726855347.70471: dumping result to json 34296 1726855347.70473: done dumping result, returning 34296 1726855347.70479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-a97a-1acc-000000000067] 34296 1726855347.70482: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000067 34296 1726855347.70570: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000067 34296 1726855347.70573: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.70647: no more pending 
results, returning what we have 34296 1726855347.70650: results queue empty 34296 1726855347.70651: checking for any_errors_fatal 34296 1726855347.70656: done checking for any_errors_fatal 34296 1726855347.70657: checking for max_fail_percentage 34296 1726855347.70658: done checking for max_fail_percentage 34296 1726855347.70659: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.70659: done checking to see if all hosts have failed 34296 1726855347.70660: getting the remaining hosts for this loop 34296 1726855347.70661: done getting the remaining hosts for this loop 34296 1726855347.70664: getting the next task for host managed_node1 34296 1726855347.70671: done getting next task for host managed_node1 34296 1726855347.70675: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855347.70678: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.70695: getting variables 34296 1726855347.70696: in VariableManager get_vars() 34296 1726855347.70733: Calling all_inventory to load vars for managed_node1 34296 1726855347.70735: Calling groups_inventory to load vars for managed_node1 34296 1726855347.70738: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.70744: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.70746: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.70748: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.70873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.71073: done with get_vars() 34296 1726855347.71083: done getting variables 34296 1726855347.71138: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:27 -0400 (0:00:00.016) 0:00:03.749 ****** 34296 1726855347.71170: entering _queue_task() for managed_node1/dnf 34296 1726855347.71410: worker is 1 (out of 1 available) 34296 1726855347.71422: exiting _queue_task() for managed_node1/dnf 34296 1726855347.71432: done queuing things up, now waiting for results queue to drain 34296 1726855347.71434: waiting for pending results... 
34296 1726855347.71696: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855347.71831: in run() - task 0affcc66-ac2b-a97a-1acc-000000000068 34296 1726855347.71847: variable 'ansible_search_path' from source: unknown 34296 1726855347.71855: variable 'ansible_search_path' from source: unknown 34296 1726855347.71903: calling self._execute() 34296 1726855347.72012: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.72015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.72017: variable 'omit' from source: magic vars 34296 1726855347.72391: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.72410: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.72554: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.72558: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.72560: when evaluation is False, skipping this task 34296 1726855347.72562: _execute() done 34296 1726855347.72565: dumping result to json 34296 1726855347.72569: done dumping result, returning 34296 1726855347.72693: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000068] 34296 1726855347.72697: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000068 34296 1726855347.72771: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000068 34296 1726855347.72774: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34296 1726855347.72826: no more pending results, returning what we have 34296 1726855347.72829: results queue empty 34296 1726855347.72830: checking for any_errors_fatal 34296 1726855347.72835: done checking for any_errors_fatal 34296 1726855347.72836: checking for max_fail_percentage 34296 1726855347.72838: done checking for max_fail_percentage 34296 1726855347.72839: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.72840: done checking to see if all hosts have failed 34296 1726855347.72840: getting the remaining hosts for this loop 34296 1726855347.72842: done getting the remaining hosts for this loop 34296 1726855347.72845: getting the next task for host managed_node1 34296 1726855347.72852: done getting next task for host managed_node1 34296 1726855347.72856: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855347.72859: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.72879: getting variables 34296 1726855347.72881: in VariableManager get_vars() 34296 1726855347.72930: Calling all_inventory to load vars for managed_node1 34296 1726855347.72933: Calling groups_inventory to load vars for managed_node1 34296 1726855347.72936: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.72947: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.72950: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.72953: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.73356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.73562: done with get_vars() 34296 1726855347.73576: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34296 1726855347.73651: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.774 ****** 34296 1726855347.73682: entering _queue_task() for managed_node1/yum 34296 1726855347.73960: worker is 1 (out of 1 available) 34296 1726855347.73974: exiting _queue_task() for managed_node1/yum 34296 1726855347.73986: done queuing things up, now waiting for results queue to drain 34296 1726855347.74193: waiting for pending results... 
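The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows plugin resolution consulting a redirect table before loading the action, so a `yum` task is served by the dnf action plugin. A toy sketch of that lookup (the table and function are illustrative, not the real routing metadata):

```python
# Hypothetical redirect table; real redirects live in collection routing
# metadata shipped with ansible-core.
REDIRECTS = {
    "ansible.builtin.yum": "ansible.builtin.dnf",
}

def resolve_action(fqcn: str) -> str:
    # Follow redirects until a terminal plugin name is reached, guarding
    # against accidental redirect cycles.
    seen = set()
    while fqcn in REDIRECTS and fqcn not in seen:
        seen.add(fqcn)
        fqcn = REDIRECTS[fqcn]
    return fqcn
```

A name with no redirect entry resolves to itself, which is why the surrounding tasks load their `fail`, `debug`, and `dnf` actions directly.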
34296 1726855347.74321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855347.74386: in run() - task 0affcc66-ac2b-a97a-1acc-000000000069 34296 1726855347.74410: variable 'ansible_search_path' from source: unknown 34296 1726855347.74594: variable 'ansible_search_path' from source: unknown 34296 1726855347.74597: calling self._execute() 34296 1726855347.74600: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.74603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.74605: variable 'omit' from source: magic vars 34296 1726855347.74933: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.74951: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.75158: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.75162: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.75165: when evaluation is False, skipping this task 34296 1726855347.75170: _execute() done 34296 1726855347.75172: dumping result to json 34296 1726855347.75174: done dumping result, returning 34296 1726855347.75177: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000069] 34296 1726855347.75179: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000069 34296 1726855347.75248: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000069 34296 1726855347.75251: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34296 1726855347.75314: no more pending results, returning what we have 34296 1726855347.75318: results queue empty 34296 1726855347.75319: checking for any_errors_fatal 34296 1726855347.75325: done checking for any_errors_fatal 34296 1726855347.75326: checking for max_fail_percentage 34296 1726855347.75328: done checking for max_fail_percentage 34296 1726855347.75329: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.75329: done checking to see if all hosts have failed 34296 1726855347.75330: getting the remaining hosts for this loop 34296 1726855347.75332: done getting the remaining hosts for this loop 34296 1726855347.75335: getting the next task for host managed_node1 34296 1726855347.75341: done getting next task for host managed_node1 34296 1726855347.75345: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855347.75348: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.75368: getting variables 34296 1726855347.75370: in VariableManager get_vars() 34296 1726855347.75419: Calling all_inventory to load vars for managed_node1 34296 1726855347.75422: Calling groups_inventory to load vars for managed_node1 34296 1726855347.75424: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.75435: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.75438: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.75441: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.75817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.76033: done with get_vars() 34296 1726855347.76043: done getting variables 34296 1726855347.76100: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.799 ****** 34296 1726855347.76130: entering _queue_task() for managed_node1/fail 34296 1726855347.76376: worker is 1 (out of 1 available) 34296 1726855347.76390: exiting _queue_task() for managed_node1/fail 34296 1726855347.76401: done queuing things up, now waiting for results queue to drain 34296 1726855347.76403: waiting for pending results... 
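Every skipped task in this trace reports the same `"false_condition": "ansible_distribution_major_version == '7'"`, and the log shows two conditionals evaluated in order (`!= '6'` passes, `== '7'` fails). A hypothetical sketch of how such a task could be gated in the role's `main.yml` — task name taken from the log, body and exact `when:` layout assumed, not the actual `fedora.linux_system_roles.network` source:

```yaml
# Hypothetical sketch, not the real role source. A `when:` given as a list
# is evaluated item by item, matching the log's two "Evaluated conditional"
# lines: the first is True, the second is False, so the task is skipped
# with "Conditional result was False".
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: "Refusing to restart NetworkManager without consent"  # assumed message
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```

On the host in this run the major version is neither 6 nor 7, so the first item holds and the second does not, which is exactly the True/False pair the trace records before each skip.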
34296 1726855347.76805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855347.76809: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006a 34296 1726855347.76812: variable 'ansible_search_path' from source: unknown 34296 1726855347.76815: variable 'ansible_search_path' from source: unknown 34296 1726855347.76843: calling self._execute() 34296 1726855347.76936: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.76947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.76961: variable 'omit' from source: magic vars 34296 1726855347.77328: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.77350: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.77457: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.77470: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.77477: when evaluation is False, skipping this task 34296 1726855347.77483: _execute() done 34296 1726855347.77555: dumping result to json 34296 1726855347.77558: done dumping result, returning 34296 1726855347.77560: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-00000000006a] 34296 1726855347.77563: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006a 34296 1726855347.77631: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006a 34296 1726855347.77634: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.77708: no more pending results, returning what we have 
34296 1726855347.77712: results queue empty 34296 1726855347.77713: checking for any_errors_fatal 34296 1726855347.77719: done checking for any_errors_fatal 34296 1726855347.77720: checking for max_fail_percentage 34296 1726855347.77721: done checking for max_fail_percentage 34296 1726855347.77722: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.77723: done checking to see if all hosts have failed 34296 1726855347.77723: getting the remaining hosts for this loop 34296 1726855347.77725: done getting the remaining hosts for this loop 34296 1726855347.77728: getting the next task for host managed_node1 34296 1726855347.77734: done getting next task for host managed_node1 34296 1726855347.77738: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34296 1726855347.77741: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.77758: getting variables 34296 1726855347.77760: in VariableManager get_vars() 34296 1726855347.77807: Calling all_inventory to load vars for managed_node1 34296 1726855347.77810: Calling groups_inventory to load vars for managed_node1 34296 1726855347.77812: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.77823: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.77825: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.77827: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.78215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.78404: done with get_vars() 34296 1726855347.78411: done getting variables 34296 1726855347.78468: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:27 -0400 (0:00:00.023) 0:00:03.822 ****** 34296 1726855347.78502: entering _queue_task() for managed_node1/package 34296 1726855347.78753: worker is 1 (out of 1 available) 34296 1726855347.78768: exiting _queue_task() for managed_node1/package 34296 1726855347.78778: done queuing things up, now waiting for results queue to drain 34296 1726855347.78780: waiting for pending results... 
34296 1726855347.79206: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34296 1726855347.79210: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006b 34296 1726855347.79213: variable 'ansible_search_path' from source: unknown 34296 1726855347.79216: variable 'ansible_search_path' from source: unknown 34296 1726855347.79219: calling self._execute() 34296 1726855347.79308: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.79318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.79337: variable 'omit' from source: magic vars 34296 1726855347.79709: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.79729: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.79851: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.79863: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.79876: when evaluation is False, skipping this task 34296 1726855347.79891: _execute() done 34296 1726855347.79898: dumping result to json 34296 1726855347.79905: done dumping result, returning 34296 1726855347.79917: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-a97a-1acc-00000000006b] 34296 1726855347.79925: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.80185: no more pending results, returning what we have 34296 1726855347.80191: results queue empty 34296 1726855347.80192: checking for any_errors_fatal 34296 1726855347.80199: done checking for any_errors_fatal 34296 1726855347.80200: checking for max_fail_percentage 34296 1726855347.80202: done checking for 
max_fail_percentage 34296 1726855347.80202: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.80203: done checking to see if all hosts have failed 34296 1726855347.80204: getting the remaining hosts for this loop 34296 1726855347.80205: done getting the remaining hosts for this loop 34296 1726855347.80209: getting the next task for host managed_node1 34296 1726855347.80217: done getting next task for host managed_node1 34296 1726855347.80220: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855347.80223: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.80241: getting variables 34296 1726855347.80242: in VariableManager get_vars() 34296 1726855347.80462: Calling all_inventory to load vars for managed_node1 34296 1726855347.80467: Calling groups_inventory to load vars for managed_node1 34296 1726855347.80470: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.80476: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006b 34296 1726855347.80479: WORKER PROCESS EXITING 34296 1726855347.80488: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.80491: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.80495: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.80672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.80889: done with get_vars() 34296 1726855347.80900: done getting variables 34296 1726855347.80956: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.847 ****** 34296 1726855347.80991: entering _queue_task() for managed_node1/package 34296 1726855347.81278: worker is 1 (out of 1 available) 34296 1726855347.81493: exiting _queue_task() for managed_node1/package 34296 1726855347.81502: done queuing things up, now waiting for results queue to drain 34296 1726855347.81503: waiting for pending results... 
34296 1726855347.81574: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855347.81704: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006c 34296 1726855347.81730: variable 'ansible_search_path' from source: unknown 34296 1726855347.81739: variable 'ansible_search_path' from source: unknown 34296 1726855347.81783: calling self._execute() 34296 1726855347.81879: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.81893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.81907: variable 'omit' from source: magic vars 34296 1726855347.82261: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.82382: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.82415: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.82426: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.82434: when evaluation is False, skipping this task 34296 1726855347.82441: _execute() done 34296 1726855347.82447: dumping result to json 34296 1726855347.82455: done dumping result, returning 34296 1726855347.82470: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-00000000006c] 34296 1726855347.82481: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006c 34296 1726855347.82808: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006c 34296 1726855347.82811: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.82852: no more pending results, returning what we have 34296 1726855347.82856: 
results queue empty 34296 1726855347.82857: checking for any_errors_fatal 34296 1726855347.82862: done checking for any_errors_fatal 34296 1726855347.82863: checking for max_fail_percentage 34296 1726855347.82865: done checking for max_fail_percentage 34296 1726855347.82868: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.82869: done checking to see if all hosts have failed 34296 1726855347.82870: getting the remaining hosts for this loop 34296 1726855347.82871: done getting the remaining hosts for this loop 34296 1726855347.82875: getting the next task for host managed_node1 34296 1726855347.82881: done getting next task for host managed_node1 34296 1726855347.82884: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855347.82890: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.82906: getting variables 34296 1726855347.82908: in VariableManager get_vars() 34296 1726855347.82954: Calling all_inventory to load vars for managed_node1 34296 1726855347.82957: Calling groups_inventory to load vars for managed_node1 34296 1726855347.82959: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.82972: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.82975: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.82978: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.83299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.83502: done with get_vars() 34296 1726855347.83512: done getting variables 34296 1726855347.83571: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:27 -0400 (0:00:00.026) 0:00:03.873 ****** 34296 1726855347.83604: entering _queue_task() for managed_node1/package 34296 1726855347.83878: worker is 1 (out of 1 available) 34296 1726855347.83995: exiting _queue_task() for managed_node1/package 34296 1726855347.84006: done queuing things up, now waiting for results queue to drain 34296 1726855347.84007: waiting for pending results... 
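The repeated `Evaluated conditional (...): True/False` lines come from rendering each `when:` expression against the host's gathered facts with Jinja2 (the run header reports jinja 3.1.4). A minimal standalone sketch of that evaluation using plain `jinja2` rather than Ansible's internal templar — the fact value `'9'` is an assumption, since the log only shows the version is neither `'6'` nor `'7'`:

```python
from jinja2 import Environment

# Facts as gathered for managed_node1; '9' is an assumed value -- the log
# only tells us it is not '6' and not '7'.
facts = {"ansible_distribution_major_version": "9"}

env = Environment()
# compile_expression turns a bare Jinja2 expression into a callable,
# roughly what happens when Ansible evaluates a `when:` clause.
first = env.compile_expression("ansible_distribution_major_version != '6'")
second = env.compile_expression("ansible_distribution_major_version == '7'")

print(first(**facts))   # True  -> evaluation continues to the next conditional
print(second(**facts))  # False -> task is skipped
```

This mirrors the trace's flow: the first conditional evaluates True, the second False, and a False result short-circuits the task into "when evaluation is False, skipping this task".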
34296 1726855347.84175: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855347.84315: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006d 34296 1726855347.84337: variable 'ansible_search_path' from source: unknown 34296 1726855347.84348: variable 'ansible_search_path' from source: unknown 34296 1726855347.84395: calling self._execute() 34296 1726855347.84486: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.84500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.84515: variable 'omit' from source: magic vars 34296 1726855347.84875: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.84899: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.85020: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.85029: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.85035: when evaluation is False, skipping this task 34296 1726855347.85041: _execute() done 34296 1726855347.85046: dumping result to json 34296 1726855347.85052: done dumping result, returning 34296 1726855347.85064: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-00000000006d] 34296 1726855347.85102: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.85424: no more pending results, returning what we have 34296 1726855347.85428: results queue empty 34296 1726855347.85429: checking for any_errors_fatal 34296 1726855347.85434: done checking for any_errors_fatal 34296 1726855347.85435: 
checking for max_fail_percentage 34296 1726855347.85437: done checking for max_fail_percentage 34296 1726855347.85437: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.85438: done checking to see if all hosts have failed 34296 1726855347.85439: getting the remaining hosts for this loop 34296 1726855347.85440: done getting the remaining hosts for this loop 34296 1726855347.85444: getting the next task for host managed_node1 34296 1726855347.85449: done getting next task for host managed_node1 34296 1726855347.85453: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855347.85456: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.85474: getting variables 34296 1726855347.85475: in VariableManager get_vars() 34296 1726855347.85518: Calling all_inventory to load vars for managed_node1 34296 1726855347.85520: Calling groups_inventory to load vars for managed_node1 34296 1726855347.85522: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.85531: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.85534: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.85537: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.85748: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006d 34296 1726855347.85751: WORKER PROCESS EXITING 34296 1726855347.85778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.85992: done with get_vars() 34296 1726855347.86003: done getting variables 34296 1726855347.86058: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.898 ****** 34296 1726855347.86095: entering _queue_task() for managed_node1/service 34296 1726855347.86375: worker is 1 (out of 1 available) 34296 1726855347.86389: exiting _queue_task() for managed_node1/service 34296 1726855347.86401: done queuing things up, now waiting for results queue to drain 34296 1726855347.86403: waiting for pending results... 
34296 1726855347.86671: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855347.86798: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006e 34296 1726855347.86815: variable 'ansible_search_path' from source: unknown 34296 1726855347.86822: variable 'ansible_search_path' from source: unknown 34296 1726855347.86859: calling self._execute() 34296 1726855347.86949: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.86960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.86976: variable 'omit' from source: magic vars 34296 1726855347.87419: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.87439: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.87559: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.87572: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.87579: when evaluation is False, skipping this task 34296 1726855347.87585: _execute() done 34296 1726855347.87593: dumping result to json 34296 1726855347.87601: done dumping result, returning 34296 1726855347.87613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-00000000006e] 34296 1726855347.87621: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006e 34296 1726855347.87729: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006e 34296 1726855347.87738: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855347.87803: no more pending results, returning what we have 34296 1726855347.87807: results queue empty 
34296 1726855347.87808: checking for any_errors_fatal 34296 1726855347.87814: done checking for any_errors_fatal 34296 1726855347.87815: checking for max_fail_percentage 34296 1726855347.87817: done checking for max_fail_percentage 34296 1726855347.87817: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.87818: done checking to see if all hosts have failed 34296 1726855347.87819: getting the remaining hosts for this loop 34296 1726855347.87820: done getting the remaining hosts for this loop 34296 1726855347.87824: getting the next task for host managed_node1 34296 1726855347.87830: done getting next task for host managed_node1 34296 1726855347.87834: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855347.87837: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.87855: getting variables 34296 1726855347.87857: in VariableManager get_vars() 34296 1726855347.87907: Calling all_inventory to load vars for managed_node1 34296 1726855347.87910: Calling groups_inventory to load vars for managed_node1 34296 1726855347.87912: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.87923: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.87926: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.87929: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.88378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.88569: done with get_vars() 34296 1726855347.88579: done getting variables 34296 1726855347.88628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.924 ****** 34296 1726855347.88652: entering _queue_task() for managed_node1/service 34296 1726855347.88890: worker is 1 (out of 1 available) 34296 1726855347.88903: exiting _queue_task() for managed_node1/service 34296 1726855347.88915: done queuing things up, now waiting for results queue to drain 34296 1726855347.88916: waiting for pending results... 
34296 1726855347.89307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855347.89322: in run() - task 0affcc66-ac2b-a97a-1acc-00000000006f 34296 1726855347.89340: variable 'ansible_search_path' from source: unknown 34296 1726855347.89348: variable 'ansible_search_path' from source: unknown 34296 1726855347.89391: calling self._execute() 34296 1726855347.89484: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855347.89499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855347.89519: variable 'omit' from source: magic vars 34296 1726855347.89906: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.89925: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855347.90045: variable 'ansible_distribution_major_version' from source: facts 34296 1726855347.90062: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855347.90074: when evaluation is False, skipping this task 34296 1726855347.90082: _execute() done 34296 1726855347.90091: dumping result to json 34296 1726855347.90100: done dumping result, returning 34296 1726855347.90111: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-a97a-1acc-00000000006f] 34296 1726855347.90120: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006f 34296 1726855347.90394: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000006f 34296 1726855347.90397: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855347.90435: no more pending results, returning what we have 34296 1726855347.90438: results queue empty 34296 1726855347.90439: checking for any_errors_fatal 
34296 1726855347.90443: done checking for any_errors_fatal 34296 1726855347.90444: checking for max_fail_percentage 34296 1726855347.90445: done checking for max_fail_percentage 34296 1726855347.90446: checking to see if all hosts have failed and the running result is not ok 34296 1726855347.90446: done checking to see if all hosts have failed 34296 1726855347.90447: getting the remaining hosts for this loop 34296 1726855347.90449: done getting the remaining hosts for this loop 34296 1726855347.90452: getting the next task for host managed_node1 34296 1726855347.90458: done getting next task for host managed_node1 34296 1726855347.90462: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855347.90467: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855347.90484: getting variables 34296 1726855347.90486: in VariableManager get_vars() 34296 1726855347.90534: Calling all_inventory to load vars for managed_node1 34296 1726855347.90537: Calling groups_inventory to load vars for managed_node1 34296 1726855347.90540: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855347.90550: Calling all_plugins_play to load vars for managed_node1 34296 1726855347.90553: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855347.90556: Calling groups_plugins_play to load vars for managed_node1 34296 1726855347.90809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855347.91006: done with get_vars() 34296 1726855347.91016: done getting variables 34296 1726855347.91069: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.948 ****** 34296 1726855347.91100: entering _queue_task() for managed_node1/service 34296 1726855347.91345: worker is 1 (out of 1 available) 34296 1726855347.91357: exiting _queue_task() for managed_node1/service 34296 1726855347.91372: done queuing things up, now waiting for results queue to drain 34296 1726855347.91373: waiting for pending results... 
34296 1726855347.91628: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34296 1726855347.91751: in run() - task 0affcc66-ac2b-a97a-1acc-000000000070
34296 1726855347.91773: variable 'ansible_search_path' from source: unknown
34296 1726855347.91782: variable 'ansible_search_path' from source: unknown
34296 1726855347.91823: calling self._execute()
34296 1726855347.91909: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.91925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.91939: variable 'omit' from source: magic vars
34296 1726855347.92404: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.92421: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.92540: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.92553: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.92573: when evaluation is False, skipping this task
34296 1726855347.92576: _execute() done
34296 1726855347.92578: dumping result to json
34296 1726855347.92792: done dumping result, returning
34296 1726855347.92796: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-a97a-1acc-000000000070]
34296 1726855347.92798: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000070
34296 1726855347.92862: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000070
34296 1726855347.92864: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855347.92909: no more pending results, returning what we have
34296 1726855347.92912: results queue empty
34296 1726855347.92913: checking for any_errors_fatal
34296 1726855347.92918: done checking for any_errors_fatal
34296 1726855347.92919: checking for max_fail_percentage
34296 1726855347.92921: done checking for max_fail_percentage
34296 1726855347.92921: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.92922: done checking to see if all hosts have failed
34296 1726855347.92923: getting the remaining hosts for this loop
34296 1726855347.92924: done getting the remaining hosts for this loop
34296 1726855347.92927: getting the next task for host managed_node1
34296 1726855347.92932: done getting next task for host managed_node1
34296 1726855347.92936: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34296 1726855347.92939: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.92956: getting variables
34296 1726855347.92958: in VariableManager get_vars()
34296 1726855347.93006: Calling all_inventory to load vars for managed_node1
34296 1726855347.93009: Calling groups_inventory to load vars for managed_node1
34296 1726855347.93011: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.93021: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.93024: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.93026: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.93332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.93544: done with get_vars()
34296 1726855347.93554: done getting variables
34296 1726855347.93613: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 14:02:27 -0400 (0:00:00.025) 0:00:03.974 ******
34296 1726855347.93642: entering _queue_task() for managed_node1/service
34296 1726855347.93902: worker is 1 (out of 1 available)
34296 1726855347.93914: exiting _queue_task() for managed_node1/service
34296 1726855347.93926: done queuing things up, now waiting for results queue to drain
34296 1726855347.93927: waiting for pending results...
34296 1726855347.94184: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
34296 1726855347.94311: in run() - task 0affcc66-ac2b-a97a-1acc-000000000071
34296 1726855347.94330: variable 'ansible_search_path' from source: unknown
34296 1726855347.94338: variable 'ansible_search_path' from source: unknown
34296 1726855347.94380: calling self._execute()
34296 1726855347.94470: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.94519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.94522: variable 'omit' from source: magic vars
34296 1726855347.94855: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.94877: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.95009: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.95061: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.95064: when evaluation is False, skipping this task
34296 1726855347.95069: _execute() done
34296 1726855347.95072: dumping result to json
34296 1726855347.95074: done dumping result, returning
34296 1726855347.95076: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-a97a-1acc-000000000071]
34296 1726855347.95079: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000071
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34296 1726855347.95423: no more pending results, returning what we have
34296 1726855347.95426: results queue empty
34296 1726855347.95427: checking for any_errors_fatal
34296 1726855347.95432: done checking for any_errors_fatal
34296 1726855347.95433: checking for max_fail_percentage
34296 1726855347.95434: done checking for max_fail_percentage
34296 1726855347.95435: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.95436: done checking to see if all hosts have failed
34296 1726855347.95437: getting the remaining hosts for this loop
34296 1726855347.95438: done getting the remaining hosts for this loop
34296 1726855347.95442: getting the next task for host managed_node1
34296 1726855347.95448: done getting next task for host managed_node1
34296 1726855347.95451: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34296 1726855347.95456: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.95476: getting variables
34296 1726855347.95477: in VariableManager get_vars()
34296 1726855347.95520: Calling all_inventory to load vars for managed_node1
34296 1726855347.95523: Calling groups_inventory to load vars for managed_node1
34296 1726855347.95526: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.95535: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.95537: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.95541: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.95748: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000071
34296 1726855347.95752: WORKER PROCESS EXITING
34296 1726855347.95779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.95992: done with get_vars()
34296 1726855347.96002: done getting variables
34296 1726855347.96060: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 14:02:27 -0400 (0:00:00.024) 0:00:03.998 ******
34296 1726855347.96097: entering _queue_task() for managed_node1/copy
34296 1726855347.96377: worker is 1 (out of 1 available)
34296 1726855347.96495: exiting _queue_task() for managed_node1/copy
34296 1726855347.96507: done queuing things up, now waiting for results queue to drain
34296 1726855347.96508: waiting for pending results...
34296 1726855347.96683: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34296 1726855347.96803: in run() - task 0affcc66-ac2b-a97a-1acc-000000000072
34296 1726855347.96822: variable 'ansible_search_path' from source: unknown
34296 1726855347.96830: variable 'ansible_search_path' from source: unknown
34296 1726855347.96875: calling self._execute()
34296 1726855347.96960: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.96972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.96984: variable 'omit' from source: magic vars
34296 1726855347.97430: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.97447: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855347.97563: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.97579: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855347.97586: when evaluation is False, skipping this task
34296 1726855347.97595: _execute() done
34296 1726855347.97602: dumping result to json
34296 1726855347.97609: done dumping result, returning
34296 1726855347.97621: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-a97a-1acc-000000000072]
34296 1726855347.97630: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000072
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855347.97778: no more pending results, returning what we have
34296 1726855347.97782: results queue empty
34296 1726855347.97783: checking for any_errors_fatal
34296 1726855347.97792: done checking for any_errors_fatal
34296 1726855347.97793: checking for max_fail_percentage
34296 1726855347.97795: done checking for max_fail_percentage
34296 1726855347.97795: checking to see if all hosts have failed and the running result is not ok
34296 1726855347.97796: done checking to see if all hosts have failed
34296 1726855347.97797: getting the remaining hosts for this loop
34296 1726855347.97798: done getting the remaining hosts for this loop
34296 1726855347.97802: getting the next task for host managed_node1
34296 1726855347.97809: done getting next task for host managed_node1
34296 1726855347.97813: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34296 1726855347.97816: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855347.97833: getting variables
34296 1726855347.97835: in VariableManager get_vars()
34296 1726855347.98056: Calling all_inventory to load vars for managed_node1
34296 1726855347.98059: Calling groups_inventory to load vars for managed_node1
34296 1726855347.98062: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855347.98078: Calling all_plugins_play to load vars for managed_node1
34296 1726855347.98080: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855347.98083: Calling groups_plugins_play to load vars for managed_node1
34296 1726855347.98416: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000072
34296 1726855347.98419: WORKER PROCESS EXITING
34296 1726855347.98440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855347.98648: done with get_vars()
34296 1726855347.98659: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 14:02:27 -0400 (0:00:00.026) 0:00:04.025 ******
34296 1726855347.98742: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34296 1726855347.99000: worker is 1 (out of 1 available)
34296 1726855347.99014: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34296 1726855347.99024: done queuing things up, now waiting for results queue to drain
34296 1726855347.99025: waiting for pending results...
34296 1726855347.99281: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34296 1726855347.99413: in run() - task 0affcc66-ac2b-a97a-1acc-000000000073
34296 1726855347.99433: variable 'ansible_search_path' from source: unknown
34296 1726855347.99441: variable 'ansible_search_path' from source: unknown
34296 1726855347.99482: calling self._execute()
34296 1726855347.99564: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855347.99578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855347.99594: variable 'omit' from source: magic vars
34296 1726855347.99963: variable 'ansible_distribution_major_version' from source: facts
34296 1726855347.99986: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855348.00104: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.00116: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855348.00124: when evaluation is False, skipping this task
34296 1726855348.00131: _execute() done
34296 1726855348.00138: dumping result to json
34296 1726855348.00145: done dumping result, returning
34296 1726855348.00161: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-a97a-1acc-000000000073]
34296 1726855348.00175: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000073
34296 1726855348.00415: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000073
34296 1726855348.00418: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855348.00459: no more pending results, returning what we have
34296 1726855348.00462: results queue empty
34296 1726855348.00463: checking for any_errors_fatal
34296 1726855348.00470: done checking for any_errors_fatal
34296 1726855348.00471: checking for max_fail_percentage
34296 1726855348.00472: done checking for max_fail_percentage
34296 1726855348.00472: checking to see if all hosts have failed and the running result is not ok
34296 1726855348.00473: done checking to see if all hosts have failed
34296 1726855348.00474: getting the remaining hosts for this loop
34296 1726855348.00475: done getting the remaining hosts for this loop
34296 1726855348.00478: getting the next task for host managed_node1
34296 1726855348.00484: done getting next task for host managed_node1
34296 1726855348.00489: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
34296 1726855348.00492: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855348.00507: getting variables
34296 1726855348.00508: in VariableManager get_vars()
34296 1726855348.00559: Calling all_inventory to load vars for managed_node1
34296 1726855348.00563: Calling groups_inventory to load vars for managed_node1
34296 1726855348.00568: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855348.00579: Calling all_plugins_play to load vars for managed_node1
34296 1726855348.00582: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855348.00585: Calling groups_plugins_play to load vars for managed_node1
34296 1726855348.00864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855348.01100: done with get_vars()
34296 1726855348.01110: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.049 ******
34296 1726855348.01198: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34296 1726855348.01441: worker is 1 (out of 1 available)
34296 1726855348.01455: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34296 1726855348.01470: done queuing things up, now waiting for results queue to drain
34296 1726855348.01472: waiting for pending results...
34296 1726855348.01738: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
34296 1726855348.01863: in run() - task 0affcc66-ac2b-a97a-1acc-000000000074
34296 1726855348.01886: variable 'ansible_search_path' from source: unknown
34296 1726855348.01897: variable 'ansible_search_path' from source: unknown
34296 1726855348.01939: calling self._execute()
34296 1726855348.02029: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855348.02040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855348.02053: variable 'omit' from source: magic vars
34296 1726855348.02422: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.02441: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855348.02594: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.02597: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855348.02600: when evaluation is False, skipping this task
34296 1726855348.02603: _execute() done
34296 1726855348.02605: dumping result to json
34296 1726855348.02607: done dumping result, returning
34296 1726855348.02609: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-a97a-1acc-000000000074]
34296 1726855348.02611: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000074
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855348.02840: no more pending results, returning what we have
34296 1726855348.02843: results queue empty
34296 1726855348.02845: checking for any_errors_fatal
34296 1726855348.02854: done checking for any_errors_fatal
34296 1726855348.02855: checking for max_fail_percentage
34296 1726855348.02857: done checking for max_fail_percentage
34296 1726855348.02858: checking to see if all hosts have failed and the running result is not ok
34296 1726855348.02859: done checking to see if all hosts have failed
34296 1726855348.02860: getting the remaining hosts for this loop
34296 1726855348.02861: done getting the remaining hosts for this loop
34296 1726855348.02868: getting the next task for host managed_node1
34296 1726855348.02875: done getting next task for host managed_node1
34296 1726855348.02878: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34296 1726855348.02882: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855348.02902: getting variables
34296 1726855348.02904: in VariableManager get_vars()
34296 1726855348.02951: Calling all_inventory to load vars for managed_node1
34296 1726855348.02954: Calling groups_inventory to load vars for managed_node1
34296 1726855348.02957: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855348.02971: Calling all_plugins_play to load vars for managed_node1
34296 1726855348.02975: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855348.02979: Calling groups_plugins_play to load vars for managed_node1
34296 1726855348.03291: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000074
34296 1726855348.03296: WORKER PROCESS EXITING
34296 1726855348.03320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855348.03527: done with get_vars()
34296 1726855348.03537: done getting variables
34296 1726855348.03600: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.074 ******
34296 1726855348.03633: entering _queue_task() for managed_node1/debug
34296 1726855348.03881: worker is 1 (out of 1 available)
34296 1726855348.04093: exiting _queue_task() for managed_node1/debug
34296 1726855348.04102: done queuing things up, now waiting for results queue to drain
34296 1726855348.04104: waiting for pending results...
34296 1726855348.04162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34296 1726855348.04289: in run() - task 0affcc66-ac2b-a97a-1acc-000000000075
34296 1726855348.04309: variable 'ansible_search_path' from source: unknown
34296 1726855348.04315: variable 'ansible_search_path' from source: unknown
34296 1726855348.04356: calling self._execute()
34296 1726855348.04448: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855348.04460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855348.04479: variable 'omit' from source: magic vars
34296 1726855348.04856: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.04880: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855348.05001: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.05012: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855348.05020: when evaluation is False, skipping this task
34296 1726855348.05027: _execute() done
34296 1726855348.05035: dumping result to json
34296 1726855348.05043: done dumping result, returning
34296 1726855348.05056: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-a97a-1acc-000000000075]
34296 1726855348.05070: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000075
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34296 1726855348.05242: no more pending results, returning what we have
34296 1726855348.05246: results queue empty
34296 1726855348.05247: checking for any_errors_fatal
34296 1726855348.05252: done checking for any_errors_fatal
34296 1726855348.05253: checking for max_fail_percentage
34296 1726855348.05255: done checking for max_fail_percentage
34296 1726855348.05256: checking to see if all hosts have failed and the running result is not ok
34296 1726855348.05257: done checking to see if all hosts have failed
34296 1726855348.05257: getting the remaining hosts for this loop
34296 1726855348.05259: done getting the remaining hosts for this loop
34296 1726855348.05263: getting the next task for host managed_node1
34296 1726855348.05272: done getting next task for host managed_node1
34296 1726855348.05277: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34296 1726855348.05280: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855348.05300: getting variables
34296 1726855348.05302: in VariableManager get_vars()
34296 1726855348.05352: Calling all_inventory to load vars for managed_node1
34296 1726855348.05355: Calling groups_inventory to load vars for managed_node1
34296 1726855348.05357: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855348.05371: Calling all_plugins_play to load vars for managed_node1
34296 1726855348.05376: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855348.05380: Calling groups_plugins_play to load vars for managed_node1
34296 1726855348.05658: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000075
34296 1726855348.05662: WORKER PROCESS EXITING
34296 1726855348.05884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855348.06085: done with get_vars()
34296 1726855348.06099: done getting variables
34296 1726855348.06156: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 14:02:28 -0400 (0:00:00.025) 0:00:04.099 ******
34296 1726855348.06193: entering _queue_task() for managed_node1/debug
34296 1726855348.06474: worker is 1 (out of 1 available)
34296 1726855348.06591: exiting _queue_task() for managed_node1/debug
34296 1726855348.06602: done queuing things up, now waiting for results queue to drain
34296 1726855348.06604: waiting for pending results...
34296 1726855348.06772: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34296 1726855348.06894: in run() - task 0affcc66-ac2b-a97a-1acc-000000000076
34296 1726855348.06914: variable 'ansible_search_path' from source: unknown
34296 1726855348.06922: variable 'ansible_search_path' from source: unknown
34296 1726855348.06969: calling self._execute()
34296 1726855348.07060: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855348.07074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855348.07088: variable 'omit' from source: magic vars
34296 1726855348.07461: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.07485: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855348.07607: variable 'ansible_distribution_major_version' from source: facts
34296 1726855348.07617: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855348.07624: when evaluation is False, skipping this task
34296 1726855348.07631: _execute() done
34296 1726855348.07637: dumping result to json
34296 1726855348.07644: done dumping result, returning
34296 1726855348.07657: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-a97a-1acc-000000000076]
34296 1726855348.07670: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000076
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34296 1726855348.07817: no more pending results, returning what we have
34296 1726855348.07821: results queue empty
34296 1726855348.07822: checking for any_errors_fatal
34296 1726855348.07828: done checking for any_errors_fatal
34296 1726855348.07829: checking for max_fail_percentage
34296 1726855348.07831: done checking for max_fail_percentage
34296 1726855348.07832: checking to see if all hosts have failed and the running result is not ok
34296 1726855348.07833: done checking to see if all hosts have failed
34296 1726855348.07834: getting the remaining hosts for this loop
34296 1726855348.07835: done getting the remaining hosts for this loop
34296 1726855348.07839: getting the next task for host managed_node1
34296 1726855348.07846: done getting next task for host managed_node1
34296 1726855348.07850: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34296 1726855348.07854: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855348.07874: getting variables
34296 1726855348.07876: in VariableManager get_vars()
34296 1726855348.07925: Calling all_inventory to load vars for managed_node1
34296 1726855348.07928: Calling groups_inventory to load vars for managed_node1
34296 1726855348.07930: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855348.07940: Calling all_plugins_play to load vars for managed_node1
34296 1726855348.07942: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855348.07944: Calling groups_plugins_play to load vars for managed_node1
34296 1726855348.08334: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000076
34296 1726855348.08337: WORKER PROCESS EXITING
34296 1726855348.08356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855348.08557: done with get_vars()
34296 1726855348.08569: done getting variables
34296 1726855348.08626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.124 ******
34296 1726855348.08658: entering _queue_task() for managed_node1/debug
34296 1726855348.08900: worker is 1 (out of 1 available)
34296 1726855348.08912: exiting _queue_task() for managed_node1/debug
34296 1726855348.08921: done queuing things up, now waiting for results queue to drain
34296 1726855348.08922: waiting for pending results...
34296 1726855348.09305: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855348.09315: in run() - task 0affcc66-ac2b-a97a-1acc-000000000077 34296 1726855348.09333: variable 'ansible_search_path' from source: unknown 34296 1726855348.09339: variable 'ansible_search_path' from source: unknown 34296 1726855348.09378: calling self._execute() 34296 1726855348.09464: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.09480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.09496: variable 'omit' from source: magic vars 34296 1726855348.09860: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.09879: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.10001: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.10012: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.10019: when evaluation is False, skipping this task 34296 1726855348.10025: _execute() done 34296 1726855348.10031: dumping result to json 34296 1726855348.10038: done dumping result, returning 34296 1726855348.10051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-a97a-1acc-000000000077] 34296 1726855348.10063: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000077 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.10208: no more pending results, returning what we have 34296 1726855348.10212: results queue empty 34296 1726855348.10213: checking for any_errors_fatal 34296 1726855348.10219: done checking for any_errors_fatal 34296 1726855348.10220: checking for max_fail_percentage 34296 1726855348.10221: done checking for max_fail_percentage 34296 
1726855348.10223: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.10224: done checking to see if all hosts have failed 34296 1726855348.10224: getting the remaining hosts for this loop 34296 1726855348.10226: done getting the remaining hosts for this loop 34296 1726855348.10229: getting the next task for host managed_node1 34296 1726855348.10236: done getting next task for host managed_node1 34296 1726855348.10239: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855348.10243: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.10262: getting variables 34296 1726855348.10264: in VariableManager get_vars() 34296 1726855348.10315: Calling all_inventory to load vars for managed_node1 34296 1726855348.10317: Calling groups_inventory to load vars for managed_node1 34296 1726855348.10320: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.10331: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.10333: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.10336: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.10789: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000077 34296 1726855348.10793: WORKER PROCESS EXITING 34296 1726855348.10815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.11024: done with get_vars() 34296 1726855348.11033: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.148 ****** 34296 1726855348.11124: entering _queue_task() for managed_node1/ping 34296 1726855348.11370: worker is 1 (out of 1 available) 34296 1726855348.11384: exiting _queue_task() for managed_node1/ping 34296 1726855348.11395: done queuing things up, now waiting for results queue to drain 34296 1726855348.11397: waiting for pending results... 
34296 1726855348.11659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855348.11781: in run() - task 0affcc66-ac2b-a97a-1acc-000000000078 34296 1726855348.11803: variable 'ansible_search_path' from source: unknown 34296 1726855348.11813: variable 'ansible_search_path' from source: unknown 34296 1726855348.11848: calling self._execute() 34296 1726855348.11930: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.11938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.11950: variable 'omit' from source: magic vars 34296 1726855348.12334: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.12357: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.12485: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.12498: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.12507: when evaluation is False, skipping this task 34296 1726855348.12514: _execute() done 34296 1726855348.12522: dumping result to json 34296 1726855348.12529: done dumping result, returning 34296 1726855348.12569: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-a97a-1acc-000000000078] 34296 1726855348.12573: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000078 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.12926: no more pending results, returning what we have 34296 1726855348.12929: results queue empty 34296 1726855348.12930: checking for any_errors_fatal 34296 1726855348.12935: done checking for any_errors_fatal 34296 1726855348.12936: checking for max_fail_percentage 34296 1726855348.12938: done checking for 
max_fail_percentage 34296 1726855348.12939: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.12940: done checking to see if all hosts have failed 34296 1726855348.12940: getting the remaining hosts for this loop 34296 1726855348.12942: done getting the remaining hosts for this loop 34296 1726855348.12945: getting the next task for host managed_node1 34296 1726855348.12952: done getting next task for host managed_node1 34296 1726855348.12954: ^ task is: TASK: meta (role_complete) 34296 1726855348.12957: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.12977: getting variables 34296 1726855348.12979: in VariableManager get_vars() 34296 1726855348.13023: Calling all_inventory to load vars for managed_node1 34296 1726855348.13026: Calling groups_inventory to load vars for managed_node1 34296 1726855348.13029: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.13037: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.13040: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.13043: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.13216: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000078 34296 1726855348.13220: WORKER PROCESS EXITING 34296 1726855348.13243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.13446: done with get_vars() 34296 1726855348.13457: done getting variables 34296 1726855348.13541: done queuing things up, now waiting for results queue to drain 34296 1726855348.13543: results queue empty 34296 1726855348.13544: checking for any_errors_fatal 34296 1726855348.13546: done checking for any_errors_fatal 34296 1726855348.13547: checking for max_fail_percentage 34296 1726855348.13547: done checking for max_fail_percentage 34296 1726855348.13548: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.13549: done checking to see if all hosts have failed 34296 1726855348.13550: getting the remaining hosts for this loop 34296 1726855348.13550: done getting the remaining hosts for this loop 34296 1726855348.13553: getting the next task for host managed_node1 34296 1726855348.13557: done getting next task for host managed_node1 34296 1726855348.13559: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP 34296 1726855348.13561: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34296 1726855348.13563: getting variables 34296 1726855348.13564: in VariableManager get_vars() 34296 1726855348.13584: Calling all_inventory to load vars for managed_node1 34296 1726855348.13586: Calling groups_inventory to load vars for managed_node1 34296 1726855348.13589: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.13594: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.13596: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.13599: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.13734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.13949: done with get_vars() 34296 1726855348.13958: done getting variables 34296 1726855348.14002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: wireless connection with 802.1x TLS-EAP] *************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53 Friday 20 September 2024 14:02:28 -0400 (0:00:00.029) 0:00:04.178 ****** 34296 1726855348.14029: entering _queue_task() for managed_node1/debug 34296 1726855348.14522: worker is 1 (out of 1 available) 34296 1726855348.14532: exiting _queue_task() for managed_node1/debug 34296 1726855348.14541: done queuing things up, now waiting for results queue to drain 34296 1726855348.14543: waiting for pending results... 
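The skip records throughout this log all follow one pattern: the role's version gate `ansible_distribution_major_version != '6'` evaluates True, a second conditional `ansible_distribution_major_version == '7'` evaluates False, and the task is skipped with the failing condition reported as `false_condition`. As a hedged illustration (the task name and message below are hypothetical, not taken from the role; only the `when` conditions mirror the ones evaluated in this log), a `when`-gated task of this shape produces exactly such skip entries on any host whose major version is not 7:

```yaml
# Hypothetical illustrative task -- not part of the role under test.
# Both conditions must hold for the task to run; on the managed nodes
# in this log the second evaluates False, so the task is skipped and
# the log records it as the "false_condition".
- name: Show debug messages (illustrative only)
  ansible.builtin.debug:
    msg: "This only runs on distribution major version 7"
  when:
    - ansible_distribution_major_version != '6'
    - ansible_distribution_major_version == '7'
```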
34296 1726855348.14633: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP 34296 1726855348.14739: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000a8 34296 1726855348.14760: variable 'ansible_search_path' from source: unknown 34296 1726855348.14807: calling self._execute() 34296 1726855348.14895: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.14905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.14918: variable 'omit' from source: magic vars 34296 1726855348.15263: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.15283: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.15402: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.15413: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.15421: when evaluation is False, skipping this task 34296 1726855348.15426: _execute() done 34296 1726855348.15432: dumping result to json 34296 1726855348.15437: done dumping result, returning 34296 1726855348.15446: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP [0affcc66-ac2b-a97a-1acc-0000000000a8] 34296 1726855348.15454: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000a8 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.15602: no more pending results, returning what we have 34296 1726855348.15606: results queue empty 34296 1726855348.15607: checking for any_errors_fatal 34296 1726855348.15609: done checking for any_errors_fatal 34296 1726855348.15609: checking for max_fail_percentage 34296 1726855348.15611: done checking for max_fail_percentage 34296 1726855348.15612: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.15613: done checking to see if 
all hosts have failed 34296 1726855348.15614: getting the remaining hosts for this loop 34296 1726855348.15615: done getting the remaining hosts for this loop 34296 1726855348.15619: getting the next task for host managed_node1 34296 1726855348.15626: done getting next task for host managed_node1 34296 1726855348.15632: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855348.15636: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.15656: getting variables 34296 1726855348.15658: in VariableManager get_vars() 34296 1726855348.15711: Calling all_inventory to load vars for managed_node1 34296 1726855348.15714: Calling groups_inventory to load vars for managed_node1 34296 1726855348.15717: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.15728: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.15731: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.15735: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.16178: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000a8 34296 1726855348.16182: WORKER PROCESS EXITING 34296 1726855348.16206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.16413: done with get_vars() 34296 1726855348.16424: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.202 ****** 34296 1726855348.16519: entering _queue_task() for managed_node1/include_tasks 34296 1726855348.16760: worker is 1 (out of 1 available) 34296 1726855348.16774: exiting _queue_task() for managed_node1/include_tasks 34296 1726855348.16786: done queuing things up, now waiting for results queue to drain 34296 1726855348.16891: waiting for pending results... 
34296 1726855348.17052: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855348.17189: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b0 34296 1726855348.17208: variable 'ansible_search_path' from source: unknown 34296 1726855348.17217: variable 'ansible_search_path' from source: unknown 34296 1726855348.17259: calling self._execute() 34296 1726855348.17349: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.17361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.17380: variable 'omit' from source: magic vars 34296 1726855348.17746: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.17768: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.17890: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.17902: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.17910: when evaluation is False, skipping this task 34296 1726855348.17918: _execute() done 34296 1726855348.17925: dumping result to json 34296 1726855348.17933: done dumping result, returning 34296 1726855348.17994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-a97a-1acc-0000000000b0] 34296 1726855348.17997: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b0 34296 1726855348.18068: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b0 34296 1726855348.18072: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.18142: no more pending results, returning what we have 34296 1726855348.18147: results queue empty 34296 1726855348.18148: checking for 
any_errors_fatal 34296 1726855348.18156: done checking for any_errors_fatal 34296 1726855348.18157: checking for max_fail_percentage 34296 1726855348.18158: done checking for max_fail_percentage 34296 1726855348.18159: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.18160: done checking to see if all hosts have failed 34296 1726855348.18161: getting the remaining hosts for this loop 34296 1726855348.18162: done getting the remaining hosts for this loop 34296 1726855348.18169: getting the next task for host managed_node1 34296 1726855348.18176: done getting next task for host managed_node1 34296 1726855348.18180: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855348.18184: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.18206: getting variables 34296 1726855348.18208: in VariableManager get_vars() 34296 1726855348.18255: Calling all_inventory to load vars for managed_node1 34296 1726855348.18258: Calling groups_inventory to load vars for managed_node1 34296 1726855348.18261: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.18275: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.18278: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.18281: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.18896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.19077: done with get_vars() 34296 1726855348.19086: done getting variables 34296 1726855348.19143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:28 -0400 (0:00:00.026) 0:00:04.229 ****** 34296 1726855348.19176: entering _queue_task() for managed_node1/debug 34296 1726855348.19614: worker is 1 (out of 1 available) 34296 1726855348.19625: exiting _queue_task() for managed_node1/debug 34296 1726855348.19633: done queuing things up, now waiting for results queue to drain 34296 1726855348.19635: waiting for pending results... 
34296 1726855348.19763: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855348.19970: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b1 34296 1726855348.19974: variable 'ansible_search_path' from source: unknown 34296 1726855348.19976: variable 'ansible_search_path' from source: unknown 34296 1726855348.19979: calling self._execute() 34296 1726855348.20012: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.20023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.20037: variable 'omit' from source: magic vars 34296 1726855348.20424: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.20441: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.20565: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.20579: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.20586: when evaluation is False, skipping this task 34296 1726855348.20595: _execute() done 34296 1726855348.20601: dumping result to json 34296 1726855348.20609: done dumping result, returning 34296 1726855348.20626: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-a97a-1acc-0000000000b1] 34296 1726855348.20635: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b1 34296 1726855348.20993: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b1 34296 1726855348.20996: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.21033: no more pending results, returning what we have 34296 1726855348.21036: results queue empty 34296 1726855348.21037: checking for any_errors_fatal 34296 1726855348.21042: done checking for any_errors_fatal 34296 1726855348.21042: 
checking for max_fail_percentage 34296 1726855348.21044: done checking for max_fail_percentage 34296 1726855348.21045: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.21045: done checking to see if all hosts have failed 34296 1726855348.21046: getting the remaining hosts for this loop 34296 1726855348.21047: done getting the remaining hosts for this loop 34296 1726855348.21050: getting the next task for host managed_node1 34296 1726855348.21056: done getting next task for host managed_node1 34296 1726855348.21060: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855348.21062: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.21082: getting variables 34296 1726855348.21083: in VariableManager get_vars() 34296 1726855348.21126: Calling all_inventory to load vars for managed_node1 34296 1726855348.21129: Calling groups_inventory to load vars for managed_node1 34296 1726855348.21131: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.21139: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.21142: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.21144: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.21350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.21575: done with get_vars() 34296 1726855348.21586: done getting variables 34296 1726855348.21643: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.254 ****** 34296 1726855348.21676: entering _queue_task() for managed_node1/fail 34296 1726855348.21956: worker is 1 (out of 1 available) 34296 1726855348.21971: exiting _queue_task() for managed_node1/fail 34296 1726855348.21982: done queuing things up, now waiting for results queue to drain 34296 1726855348.21983: waiting for pending results... 
34296 1726855348.22256: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855348.22401: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b2 34296 1726855348.22427: variable 'ansible_search_path' from source: unknown 34296 1726855348.22438: variable 'ansible_search_path' from source: unknown 34296 1726855348.22482: calling self._execute() 34296 1726855348.22580: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.22593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.22608: variable 'omit' from source: magic vars 34296 1726855348.23006: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.23022: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.23131: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.23140: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.23146: when evaluation is False, skipping this task 34296 1726855348.23152: _execute() done 34296 1726855348.23156: dumping result to json 34296 1726855348.23162: done dumping result, returning 34296 1726855348.23179: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-a97a-1acc-0000000000b2] 34296 1726855348.23191: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.23344: no more pending results, returning what we have 34296 1726855348.23349: results queue empty 34296 1726855348.23350: 
checking for any_errors_fatal 34296 1726855348.23357: done checking for any_errors_fatal 34296 1726855348.23358: checking for max_fail_percentage 34296 1726855348.23361: done checking for max_fail_percentage 34296 1726855348.23361: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.23362: done checking to see if all hosts have failed 34296 1726855348.23363: getting the remaining hosts for this loop 34296 1726855348.23364: done getting the remaining hosts for this loop 34296 1726855348.23373: getting the next task for host managed_node1 34296 1726855348.23380: done getting next task for host managed_node1 34296 1726855348.23385: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855348.23390: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.23411: getting variables 34296 1726855348.23413: in VariableManager get_vars() 34296 1726855348.23471: Calling all_inventory to load vars for managed_node1 34296 1726855348.23475: Calling groups_inventory to load vars for managed_node1 34296 1726855348.23478: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.23793: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.23798: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.23802: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.24035: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b2 34296 1726855348.24039: WORKER PROCESS EXITING 34296 1726855348.24064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.24276: done with get_vars() 34296 1726855348.24289: done getting variables 34296 1726855348.24351: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:28 -0400 (0:00:00.027) 0:00:04.281 ****** 34296 1726855348.24390: entering _queue_task() for managed_node1/fail 34296 1726855348.24804: worker is 1 (out of 1 available) 34296 1726855348.24817: exiting _queue_task() for managed_node1/fail 34296 1726855348.24829: done queuing things up, now waiting for results queue to drain 34296 1726855348.24830: waiting for pending results... 
34296 1726855348.25028: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855348.25168: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b3 34296 1726855348.25192: variable 'ansible_search_path' from source: unknown 34296 1726855348.25202: variable 'ansible_search_path' from source: unknown 34296 1726855348.25243: calling self._execute() 34296 1726855348.25338: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.25351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.25369: variable 'omit' from source: magic vars 34296 1726855348.25771: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.25794: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.25920: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.25931: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.25938: when evaluation is False, skipping this task 34296 1726855348.25944: _execute() done 34296 1726855348.25951: dumping result to json 34296 1726855348.25958: done dumping result, returning 34296 1726855348.25971: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-a97a-1acc-0000000000b3] 34296 1726855348.25980: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b3 34296 1726855348.26084: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b3 34296 1726855348.26093: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.26139: no more 
pending results, returning what we have 34296 1726855348.26143: results queue empty 34296 1726855348.26144: checking for any_errors_fatal 34296 1726855348.26149: done checking for any_errors_fatal 34296 1726855348.26150: checking for max_fail_percentage 34296 1726855348.26152: done checking for max_fail_percentage 34296 1726855348.26152: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.26153: done checking to see if all hosts have failed 34296 1726855348.26154: getting the remaining hosts for this loop 34296 1726855348.26155: done getting the remaining hosts for this loop 34296 1726855348.26158: getting the next task for host managed_node1 34296 1726855348.26167: done getting next task for host managed_node1 34296 1726855348.26171: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855348.26175: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.26196: getting variables 34296 1726855348.26197: in VariableManager get_vars() 34296 1726855348.26241: Calling all_inventory to load vars for managed_node1 34296 1726855348.26243: Calling groups_inventory to load vars for managed_node1 34296 1726855348.26245: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.26259: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.26262: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.26267: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.26706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.26919: done with get_vars() 34296 1726855348.26930: done getting variables 34296 1726855348.26995: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:28 -0400 (0:00:00.026) 0:00:04.307 ****** 34296 1726855348.27028: entering _queue_task() for managed_node1/fail 34296 1726855348.27509: worker is 1 (out of 1 available) 34296 1726855348.27519: exiting _queue_task() for managed_node1/fail 34296 1726855348.27530: done queuing things up, now waiting for results queue to drain 34296 1726855348.27531: waiting for pending results... 
34296 1726855348.27626: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855348.27869: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b4 34296 1726855348.27873: variable 'ansible_search_path' from source: unknown 34296 1726855348.27876: variable 'ansible_search_path' from source: unknown 34296 1726855348.27878: calling self._execute() 34296 1726855348.27930: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.27942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.27955: variable 'omit' from source: magic vars 34296 1726855348.28332: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.28348: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.28462: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.28476: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.28484: when evaluation is False, skipping this task 34296 1726855348.28492: _execute() done 34296 1726855348.28498: dumping result to json 34296 1726855348.28504: done dumping result, returning 34296 1726855348.28517: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-a97a-1acc-0000000000b4] 34296 1726855348.28526: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b4 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.28670: no more pending results, returning what we have 34296 1726855348.28675: results queue empty 34296 1726855348.28676: checking for any_errors_fatal 34296 
1726855348.28683: done checking for any_errors_fatal 34296 1726855348.28684: checking for max_fail_percentage 34296 1726855348.28686: done checking for max_fail_percentage 34296 1726855348.28688: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.28689: done checking to see if all hosts have failed 34296 1726855348.28690: getting the remaining hosts for this loop 34296 1726855348.28691: done getting the remaining hosts for this loop 34296 1726855348.28695: getting the next task for host managed_node1 34296 1726855348.28701: done getting next task for host managed_node1 34296 1726855348.28705: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855348.28708: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.28727: getting variables 34296 1726855348.28729: in VariableManager get_vars() 34296 1726855348.28778: Calling all_inventory to load vars for managed_node1 34296 1726855348.28782: Calling groups_inventory to load vars for managed_node1 34296 1726855348.28784: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.28900: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.28904: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.28908: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.29310: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b4 34296 1726855348.29313: WORKER PROCESS EXITING 34296 1726855348.29335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.29545: done with get_vars() 34296 1726855348.29555: done getting variables 34296 1726855348.29619: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:28 -0400 (0:00:00.026) 0:00:04.334 ****** 34296 1726855348.29648: entering _queue_task() for managed_node1/dnf 34296 1726855348.29903: worker is 1 (out of 1 available) 34296 1726855348.29916: exiting _queue_task() for managed_node1/dnf 34296 1726855348.29926: done queuing things up, now waiting for results queue to drain 34296 1726855348.29927: waiting for pending results... 
34296 1726855348.30199: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855348.30336: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b5 34296 1726855348.30353: variable 'ansible_search_path' from source: unknown 34296 1726855348.30361: variable 'ansible_search_path' from source: unknown 34296 1726855348.30403: calling self._execute() 34296 1726855348.30492: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.30504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.30519: variable 'omit' from source: magic vars 34296 1726855348.30898: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.30915: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.31036: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.31049: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.31057: when evaluation is False, skipping this task 34296 1726855348.31063: _execute() done 34296 1726855348.31076: dumping result to json 34296 1726855348.31083: done dumping result, returning 34296 1726855348.31096: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-0000000000b5] 34296 1726855348.31105: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b5 34296 1726855348.31322: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b5 34296 1726855348.31326: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34296 1726855348.31379: no more pending results, returning what we have 34296 1726855348.31383: results queue empty 34296 1726855348.31384: checking for any_errors_fatal 34296 1726855348.31392: done checking for any_errors_fatal 34296 1726855348.31393: checking for max_fail_percentage 34296 1726855348.31395: done checking for max_fail_percentage 34296 1726855348.31396: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.31397: done checking to see if all hosts have failed 34296 1726855348.31397: getting the remaining hosts for this loop 34296 1726855348.31399: done getting the remaining hosts for this loop 34296 1726855348.31403: getting the next task for host managed_node1 34296 1726855348.31409: done getting next task for host managed_node1 34296 1726855348.31413: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855348.31416: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.31435: getting variables 34296 1726855348.31437: in VariableManager get_vars() 34296 1726855348.31486: Calling all_inventory to load vars for managed_node1 34296 1726855348.31652: Calling groups_inventory to load vars for managed_node1 34296 1726855348.31655: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.31664: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.31670: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.31673: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.31842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.32020: done with get_vars() 34296 1726855348.32030: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34296 1726855348.32107: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.359 ****** 34296 1726855348.32138: entering _queue_task() for managed_node1/yum 34296 1726855348.32422: worker is 1 (out of 1 available) 34296 1726855348.32436: exiting _queue_task() for managed_node1/yum 34296 1726855348.32448: done queuing things up, now waiting for results queue to drain 34296 1726855348.32449: waiting for pending results... 
34296 1726855348.32737: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855348.32877: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b6 34296 1726855348.32912: variable 'ansible_search_path' from source: unknown 34296 1726855348.32916: variable 'ansible_search_path' from source: unknown 34296 1726855348.32950: calling self._execute() 34296 1726855348.33094: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.33098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.33101: variable 'omit' from source: magic vars 34296 1726855348.33473: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.33494: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.33617: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.33629: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.33637: when evaluation is False, skipping this task 34296 1726855348.33674: _execute() done 34296 1726855348.33677: dumping result to json 34296 1726855348.33680: done dumping result, returning 34296 1726855348.33683: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-0000000000b6] 34296 1726855348.33686: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b6 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.33942: no more pending results, returning what we have 34296 1726855348.33947: results queue empty 34296 
1726855348.33948: checking for any_errors_fatal 34296 1726855348.33954: done checking for any_errors_fatal 34296 1726855348.33955: checking for max_fail_percentage 34296 1726855348.33957: done checking for max_fail_percentage 34296 1726855348.33958: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.33959: done checking to see if all hosts have failed 34296 1726855348.33960: getting the remaining hosts for this loop 34296 1726855348.33961: done getting the remaining hosts for this loop 34296 1726855348.33968: getting the next task for host managed_node1 34296 1726855348.33976: done getting next task for host managed_node1 34296 1726855348.33980: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855348.33983: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.34005: getting variables 34296 1726855348.34007: in VariableManager get_vars() 34296 1726855348.34057: Calling all_inventory to load vars for managed_node1 34296 1726855348.34060: Calling groups_inventory to load vars for managed_node1 34296 1726855348.34062: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.34077: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.34081: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.34084: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.34300: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b6 34296 1726855348.34303: WORKER PROCESS EXITING 34296 1726855348.34526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.34731: done with get_vars() 34296 1726855348.34742: done getting variables 34296 1726855348.34806: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:28 -0400 (0:00:00.026) 0:00:04.386 ****** 34296 1726855348.34840: entering _queue_task() for managed_node1/fail 34296 1726855348.35132: worker is 1 (out of 1 available) 34296 1726855348.35145: exiting _queue_task() for managed_node1/fail 34296 1726855348.35157: done queuing things up, now waiting for results queue to drain 34296 1726855348.35158: waiting for pending results... 
34296 1726855348.35428: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855348.35550: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b7 34296 1726855348.35569: variable 'ansible_search_path' from source: unknown 34296 1726855348.35577: variable 'ansible_search_path' from source: unknown 34296 1726855348.35621: calling self._execute() 34296 1726855348.35722: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.35735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.35750: variable 'omit' from source: magic vars 34296 1726855348.36139: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.36162: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.36290: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.36302: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.36310: when evaluation is False, skipping this task 34296 1726855348.36317: _execute() done 34296 1726855348.36324: dumping result to json 34296 1726855348.36332: done dumping result, returning 34296 1726855348.36374: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-0000000000b7] 34296 1726855348.36378: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b7 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.36729: no more pending results, returning what we have 34296 1726855348.36732: results queue empty 34296 1726855348.36732: checking for any_errors_fatal 34296 1726855348.36737: done checking for 
any_errors_fatal 34296 1726855348.36738: checking for max_fail_percentage 34296 1726855348.36740: done checking for max_fail_percentage 34296 1726855348.36740: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.36741: done checking to see if all hosts have failed 34296 1726855348.36742: getting the remaining hosts for this loop 34296 1726855348.36743: done getting the remaining hosts for this loop 34296 1726855348.36746: getting the next task for host managed_node1 34296 1726855348.36752: done getting next task for host managed_node1 34296 1726855348.36756: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34296 1726855348.36759: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.36779: getting variables 34296 1726855348.36780: in VariableManager get_vars() 34296 1726855348.36824: Calling all_inventory to load vars for managed_node1 34296 1726855348.36827: Calling groups_inventory to load vars for managed_node1 34296 1726855348.36830: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.36838: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.36841: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.36844: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.37060: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b7 34296 1726855348.37064: WORKER PROCESS EXITING 34296 1726855348.37092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.37296: done with get_vars() 34296 1726855348.37307: done getting variables 34296 1726855348.37364: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:28 -0400 (0:00:00.025) 0:00:04.411 ****** 34296 1726855348.37399: entering _queue_task() for managed_node1/package 34296 1726855348.37660: worker is 1 (out of 1 available) 34296 1726855348.37675: exiting _queue_task() for managed_node1/package 34296 1726855348.37688: done queuing things up, now waiting for results queue to drain 34296 1726855348.37792: waiting for pending results... 
34296 1726855348.37958: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34296 1726855348.38092: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b8 34296 1726855348.38112: variable 'ansible_search_path' from source: unknown 34296 1726855348.38123: variable 'ansible_search_path' from source: unknown 34296 1726855348.38163: calling self._execute() 34296 1726855348.38256: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.38270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.38285: variable 'omit' from source: magic vars 34296 1726855348.38645: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.38669: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.38781: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.38797: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.38804: when evaluation is False, skipping this task 34296 1726855348.38812: _execute() done 34296 1726855348.38894: dumping result to json 34296 1726855348.38897: done dumping result, returning 34296 1726855348.38900: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-a97a-1acc-0000000000b8] 34296 1726855348.38902: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b8 34296 1726855348.38975: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b8 34296 1726855348.38978: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.39042: no more pending results, returning what we have 34296 1726855348.39046: results queue empty 34296 1726855348.39047: checking for any_errors_fatal 34296 1726855348.39055: done 
checking for any_errors_fatal 34296 1726855348.39056: checking for max_fail_percentage 34296 1726855348.39057: done checking for max_fail_percentage 34296 1726855348.39058: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.39059: done checking to see if all hosts have failed 34296 1726855348.39060: getting the remaining hosts for this loop 34296 1726855348.39061: done getting the remaining hosts for this loop 34296 1726855348.39064: getting the next task for host managed_node1 34296 1726855348.39074: done getting next task for host managed_node1 34296 1726855348.39077: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855348.39080: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.39101: getting variables 34296 1726855348.39103: in VariableManager get_vars() 34296 1726855348.39148: Calling all_inventory to load vars for managed_node1 34296 1726855348.39151: Calling groups_inventory to load vars for managed_node1 34296 1726855348.39153: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.39164: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.39169: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.39172: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.39605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.39811: done with get_vars() 34296 1726855348.39821: done getting variables 34296 1726855348.39880: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:28 -0400 (0:00:00.025) 0:00:04.436 ****** 34296 1726855348.39911: entering _queue_task() for managed_node1/package 34296 1726855348.40169: worker is 1 (out of 1 available) 34296 1726855348.40181: exiting _queue_task() for managed_node1/package 34296 1726855348.40396: done queuing things up, now waiting for results queue to drain 34296 1726855348.40398: waiting for pending results... 
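Each "Evaluated conditional" pair in the entries above reflects a two-layer guard: a platform check (`ansible_distribution_major_version != '6'`) that passes, and a task-level check (`== '7'`) that fails, so the task is skipped with `skip_reason: Conditional result was False`. A task carrying such a guard would look roughly like the sketch below (hypothetical; the real task lives at the `tasks/main.yml:85` path shown in the log and is not reproduced here, and `network_packages` is an assumed variable name):

```yaml
- name: Install packages
  package:
    name: "{{ network_packages }}"   # assumed variable name
    state: present
  when: ansible_distribution_major_version == '7'
```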
34296 1726855348.40526: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855348.40605: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000b9 34296 1726855348.40628: variable 'ansible_search_path' from source: unknown 34296 1726855348.40635: variable 'ansible_search_path' from source: unknown 34296 1726855348.40676: calling self._execute() 34296 1726855348.40792: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.40796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.40798: variable 'omit' from source: magic vars 34296 1726855348.41153: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.41178: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.41301: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.41382: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.41386: when evaluation is False, skipping this task 34296 1726855348.41390: _execute() done 34296 1726855348.41392: dumping result to json 34296 1726855348.41394: done dumping result, returning 34296 1726855348.41397: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-0000000000b9] 34296 1726855348.41400: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b9 34296 1726855348.41471: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000b9 34296 1726855348.41475: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.41530: no more pending results, returning what we have 34296 1726855348.41534: 
results queue empty 34296 1726855348.41535: checking for any_errors_fatal 34296 1726855348.41538: done checking for any_errors_fatal 34296 1726855348.41539: checking for max_fail_percentage 34296 1726855348.41541: done checking for max_fail_percentage 34296 1726855348.41541: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.41542: done checking to see if all hosts have failed 34296 1726855348.41543: getting the remaining hosts for this loop 34296 1726855348.41544: done getting the remaining hosts for this loop 34296 1726855348.41548: getting the next task for host managed_node1 34296 1726855348.41555: done getting next task for host managed_node1 34296 1726855348.41558: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855348.41562: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.41584: getting variables 34296 1726855348.41586: in VariableManager get_vars() 34296 1726855348.41633: Calling all_inventory to load vars for managed_node1 34296 1726855348.41636: Calling groups_inventory to load vars for managed_node1 34296 1726855348.41638: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.41649: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.41652: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.41655: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.42062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.42277: done with get_vars() 34296 1726855348.42290: done getting variables 34296 1726855348.42349: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:28 -0400 (0:00:00.024) 0:00:04.461 ****** 34296 1726855348.42385: entering _queue_task() for managed_node1/package 34296 1726855348.42656: worker is 1 (out of 1 available) 34296 1726855348.42669: exiting _queue_task() for managed_node1/package 34296 1726855348.42681: done queuing things up, now waiting for results queue to drain 34296 1726855348.42682: waiting for pending results... 
34296 1726855348.43098: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855348.43110: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000ba 34296 1726855348.43131: variable 'ansible_search_path' from source: unknown 34296 1726855348.43139: variable 'ansible_search_path' from source: unknown 34296 1726855348.43184: calling self._execute() 34296 1726855348.43286: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.43302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.43319: variable 'omit' from source: magic vars 34296 1726855348.43806: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.43872: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.43948: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.43960: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.43971: when evaluation is False, skipping this task 34296 1726855348.43985: _execute() done 34296 1726855348.43995: dumping result to json 34296 1726855348.44003: done dumping result, returning 34296 1726855348.44016: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-0000000000ba] 34296 1726855348.44192: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000ba 34296 1726855348.44272: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000ba 34296 1726855348.44276: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.44328: no more pending results, returning what we have 34296 1726855348.44332: results queue 
empty 34296 1726855348.44334: checking for any_errors_fatal 34296 1726855348.44339: done checking for any_errors_fatal 34296 1726855348.44340: checking for max_fail_percentage 34296 1726855348.44342: done checking for max_fail_percentage 34296 1726855348.44343: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.44344: done checking to see if all hosts have failed 34296 1726855348.44344: getting the remaining hosts for this loop 34296 1726855348.44346: done getting the remaining hosts for this loop 34296 1726855348.44350: getting the next task for host managed_node1 34296 1726855348.44358: done getting next task for host managed_node1 34296 1726855348.44362: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855348.44368: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.44393: getting variables 34296 1726855348.44395: in VariableManager get_vars() 34296 1726855348.44444: Calling all_inventory to load vars for managed_node1 34296 1726855348.44448: Calling groups_inventory to load vars for managed_node1 34296 1726855348.44450: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.44462: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.44468: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.44472: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.44846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.45047: done with get_vars() 34296 1726855348.45058: done getting variables 34296 1726855348.45122: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:28 -0400 (0:00:00.027) 0:00:04.489 ****** 34296 1726855348.45155: entering _queue_task() for managed_node1/service 34296 1726855348.45448: worker is 1 (out of 1 available) 34296 1726855348.45464: exiting _queue_task() for managed_node1/service 34296 1726855348.45479: done queuing things up, now waiting for results queue to drain 34296 1726855348.45480: waiting for pending results... 
34296 1726855348.45903: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855348.45908: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000bb 34296 1726855348.45911: variable 'ansible_search_path' from source: unknown 34296 1726855348.45913: variable 'ansible_search_path' from source: unknown 34296 1726855348.45955: calling self._execute() 34296 1726855348.46054: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.46068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.46084: variable 'omit' from source: magic vars 34296 1726855348.46472: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.46489: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.46605: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.46615: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.46621: when evaluation is False, skipping this task 34296 1726855348.46627: _execute() done 34296 1726855348.46632: dumping result to json 34296 1726855348.46638: done dumping result, returning 34296 1726855348.46648: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-0000000000bb] 34296 1726855348.46657: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bb skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.46831: no more pending results, returning what we have 34296 1726855348.46835: results queue empty 34296 1726855348.46836: checking for any_errors_fatal 34296 1726855348.46843: done checking for any_errors_fatal 34296 1726855348.46843: 
checking for max_fail_percentage 34296 1726855348.46845: done checking for max_fail_percentage 34296 1726855348.46846: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.46847: done checking to see if all hosts have failed 34296 1726855348.46847: getting the remaining hosts for this loop 34296 1726855348.46849: done getting the remaining hosts for this loop 34296 1726855348.46852: getting the next task for host managed_node1 34296 1726855348.46859: done getting next task for host managed_node1 34296 1726855348.46862: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855348.46868: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.46890: getting variables 34296 1726855348.46892: in VariableManager get_vars() 34296 1726855348.46938: Calling all_inventory to load vars for managed_node1 34296 1726855348.46941: Calling groups_inventory to load vars for managed_node1 34296 1726855348.46944: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.46955: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.46957: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.46960: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.47378: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bb 34296 1726855348.47381: WORKER PROCESS EXITING 34296 1726855348.47407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.47613: done with get_vars() 34296 1726855348.47623: done getting variables 34296 1726855348.47685: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:28 -0400 (0:00:00.025) 0:00:04.514 ****** 34296 1726855348.47717: entering _queue_task() for managed_node1/service 34296 1726855348.47988: worker is 1 (out of 1 available) 34296 1726855348.48001: exiting _queue_task() for managed_node1/service 34296 1726855348.48012: done queuing things up, now waiting for results queue to drain 34296 1726855348.48013: waiting for pending results... 
34296 1726855348.48276: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855348.48415: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000bc 34296 1726855348.48435: variable 'ansible_search_path' from source: unknown 34296 1726855348.48444: variable 'ansible_search_path' from source: unknown 34296 1726855348.48485: calling self._execute() 34296 1726855348.48578: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.48592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.48609: variable 'omit' from source: magic vars 34296 1726855348.49061: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.49081: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.49271: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.49275: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.49277: when evaluation is False, skipping this task 34296 1726855348.49280: _execute() done 34296 1726855348.49282: dumping result to json 34296 1726855348.49284: done dumping result, returning 34296 1726855348.49288: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-a97a-1acc-0000000000bc] 34296 1726855348.49291: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bc 34296 1726855348.49362: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bc 34296 1726855348.49369: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855348.49420: no more pending results, returning what we have 34296 1726855348.49424: results queue empty 34296 1726855348.49425: checking for any_errors_fatal 
34296 1726855348.49431: done checking for any_errors_fatal 34296 1726855348.49432: checking for max_fail_percentage 34296 1726855348.49434: done checking for max_fail_percentage 34296 1726855348.49435: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.49436: done checking to see if all hosts have failed 34296 1726855348.49437: getting the remaining hosts for this loop 34296 1726855348.49438: done getting the remaining hosts for this loop 34296 1726855348.49442: getting the next task for host managed_node1 34296 1726855348.49448: done getting next task for host managed_node1 34296 1726855348.49452: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855348.49457: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.49480: getting variables 34296 1726855348.49482: in VariableManager get_vars() 34296 1726855348.49532: Calling all_inventory to load vars for managed_node1 34296 1726855348.49535: Calling groups_inventory to load vars for managed_node1 34296 1726855348.49538: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.49550: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.49553: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.49557: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.50045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.50254: done with get_vars() 34296 1726855348.50264: done getting variables 34296 1726855348.50328: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:28 -0400 (0:00:00.026) 0:00:04.541 ****** 34296 1726855348.50358: entering _queue_task() for managed_node1/service 34296 1726855348.50625: worker is 1 (out of 1 available) 34296 1726855348.50638: exiting _queue_task() for managed_node1/service 34296 1726855348.50649: done queuing things up, now waiting for results queue to drain 34296 1726855348.50650: waiting for pending results... 
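Note that two of the skipped results above ("Enable and start NetworkManager", "Enable network service") print a `censored` key instead of the usual `false_condition`: when a task sets `no_log: true`, Ansible withholds even the skip details from the result. A minimal illustration (a hypothetical task, not the role's actual source):

```yaml
- name: Enable and start NetworkManager
  service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # result output, including skip reasons, is hidden
```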
34296 1726855348.51014: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855348.51052: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000bd 34296 1726855348.51076: variable 'ansible_search_path' from source: unknown 34296 1726855348.51084: variable 'ansible_search_path' from source: unknown 34296 1726855348.51129: calling self._execute() 34296 1726855348.51224: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.51293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.51296: variable 'omit' from source: magic vars 34296 1726855348.51633: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.51653: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.51776: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.51789: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.51797: when evaluation is False, skipping this task 34296 1726855348.51805: _execute() done 34296 1726855348.51877: dumping result to json 34296 1726855348.51881: done dumping result, returning 34296 1726855348.51884: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-a97a-1acc-0000000000bd] 34296 1726855348.51886: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bd 34296 1726855348.51959: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bd 34296 1726855348.51962: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.52016: no more pending results, returning what we have 34296 1726855348.52019: results queue empty 34296 1726855348.52021: checking for any_errors_fatal 
34296 1726855348.52027: done checking for any_errors_fatal 34296 1726855348.52028: checking for max_fail_percentage 34296 1726855348.52030: done checking for max_fail_percentage 34296 1726855348.52030: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.52031: done checking to see if all hosts have failed 34296 1726855348.52032: getting the remaining hosts for this loop 34296 1726855348.52033: done getting the remaining hosts for this loop 34296 1726855348.52037: getting the next task for host managed_node1 34296 1726855348.52044: done getting next task for host managed_node1 34296 1726855348.52048: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855348.52052: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.52074: getting variables 34296 1726855348.52076: in VariableManager get_vars() 34296 1726855348.52126: Calling all_inventory to load vars for managed_node1 34296 1726855348.52129: Calling groups_inventory to load vars for managed_node1 34296 1726855348.52132: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.52143: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.52146: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.52149: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.52536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.52833: done with get_vars() 34296 1726855348.52844: done getting variables 34296 1726855348.52907: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:28 -0400 (0:00:00.025) 0:00:04.567 ****** 34296 1726855348.52939: entering _queue_task() for managed_node1/service 34296 1726855348.53406: worker is 1 (out of 1 available) 34296 1726855348.53416: exiting _queue_task() for managed_node1/service 34296 1726855348.53425: done queuing things up, now waiting for results queue to drain 34296 1726855348.53426: waiting for pending results... 
34296 1726855348.53603: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855348.53652: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000be 34296 1726855348.53659: variable 'ansible_search_path' from source: unknown 34296 1726855348.53669: variable 'ansible_search_path' from source: unknown 34296 1726855348.53760: calling self._execute() 34296 1726855348.53804: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.53815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.53828: variable 'omit' from source: magic vars 34296 1726855348.54595: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.54799: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.55094: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.55097: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.55100: when evaluation is False, skipping this task 34296 1726855348.55102: _execute() done 34296 1726855348.55105: dumping result to json 34296 1726855348.55107: done dumping result, returning 34296 1726855348.55109: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-a97a-1acc-0000000000be] 34296 1726855348.55112: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000be 34296 1726855348.55184: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000be 34296 1726855348.55189: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855348.55240: no more pending results, returning what we have 34296 1726855348.55243: results queue empty 34296 1726855348.55245: checking for any_errors_fatal 34296 
1726855348.55251: done checking for any_errors_fatal 34296 1726855348.55252: checking for max_fail_percentage 34296 1726855348.55254: done checking for max_fail_percentage 34296 1726855348.55254: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.55255: done checking to see if all hosts have failed 34296 1726855348.55256: getting the remaining hosts for this loop 34296 1726855348.55257: done getting the remaining hosts for this loop 34296 1726855348.55261: getting the next task for host managed_node1 34296 1726855348.55270: done getting next task for host managed_node1 34296 1726855348.55274: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855348.55278: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.55302: getting variables 34296 1726855348.55305: in VariableManager get_vars() 34296 1726855348.55356: Calling all_inventory to load vars for managed_node1 34296 1726855348.55359: Calling groups_inventory to load vars for managed_node1 34296 1726855348.55362: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.55376: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.55379: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.55382: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.55968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.56491: done with get_vars() 34296 1726855348.56503: done getting variables 34296 1726855348.56559: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:28 -0400 (0:00:00.036) 0:00:04.603 ****** 34296 1726855348.56597: entering _queue_task() for managed_node1/copy 34296 1726855348.57242: worker is 1 (out of 1 available) 34296 1726855348.57253: exiting _queue_task() for managed_node1/copy 34296 1726855348.57264: done queuing things up, now waiting for results queue to drain 34296 1726855348.57268: waiting for pending results... 
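Every skip in this chunk hinges on the same expression. Conceptually, Ansible renders the `when:` string through Jinja2 against the host's gathered facts and coerces the result to a boolean. The toy sketch below only approximates that with Python's `eval` (these particular comparisons happen to be valid Python); it is not how Ansible actually evaluates conditionals, and the fact value is an assumption since the distribution version never appears in this chunk.

```python
# Toy approximation of `when:` evaluation. Ansible really templates the
# expression through Jinja2; these comparisons are also valid Python, so
# a restricted eval() suffices for illustration only.
def evaluate_when(conditional: str, facts: dict) -> bool:
    # Facts act as the variable namespace; builtins are stripped.
    return bool(eval(conditional, {"__builtins__": {}}, facts))

facts = {"ansible_distribution_major_version": "40"}  # assumed value
print(evaluate_when("ansible_distribution_major_version != '6'", facts))  # True
print(evaluate_when("ansible_distribution_major_version == '7'", facts))  # False -> task skipped
```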
34296 1726855348.57741: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855348.57943: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000bf 34296 1726855348.57957: variable 'ansible_search_path' from source: unknown 34296 1726855348.57961: variable 'ansible_search_path' from source: unknown 34296 1726855348.58096: calling self._execute() 34296 1726855348.58238: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.58244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.58300: variable 'omit' from source: magic vars 34296 1726855348.59290: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.59294: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.59470: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.59474: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.59477: when evaluation is False, skipping this task 34296 1726855348.59480: _execute() done 34296 1726855348.59483: dumping result to json 34296 1726855348.59485: done dumping result, returning 34296 1726855348.59506: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-a97a-1acc-0000000000bf] 34296 1726855348.59509: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bf 34296 1726855348.59614: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000bf 34296 1726855348.59617: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.59677: no more pending results, returning what we have 34296 1726855348.59680: results queue empty 34296 
1726855348.59681: checking for any_errors_fatal 34296 1726855348.59686: done checking for any_errors_fatal 34296 1726855348.59686: checking for max_fail_percentage 34296 1726855348.59693: done checking for max_fail_percentage 34296 1726855348.59694: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.59695: done checking to see if all hosts have failed 34296 1726855348.59695: getting the remaining hosts for this loop 34296 1726855348.59697: done getting the remaining hosts for this loop 34296 1726855348.59700: getting the next task for host managed_node1 34296 1726855348.59708: done getting next task for host managed_node1 34296 1726855348.59712: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855348.59715: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.59734: getting variables 34296 1726855348.59735: in VariableManager get_vars() 34296 1726855348.59783: Calling all_inventory to load vars for managed_node1 34296 1726855348.59786: Calling groups_inventory to load vars for managed_node1 34296 1726855348.59992: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.60002: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.60004: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.60007: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.60418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.60820: done with get_vars() 34296 1726855348.60832: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:28 -0400 (0:00:00.043) 0:00:04.646 ****** 34296 1726855348.60928: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855348.61734: worker is 1 (out of 1 available) 34296 1726855348.61746: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855348.61757: done queuing things up, now waiting for results queue to drain 34296 1726855348.61758: waiting for pending results... 
34296 1726855348.62297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855348.62397: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c0 34296 1726855348.62400: variable 'ansible_search_path' from source: unknown 34296 1726855348.62403: variable 'ansible_search_path' from source: unknown 34296 1726855348.62420: calling self._execute() 34296 1726855348.62516: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.62528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.62544: variable 'omit' from source: magic vars 34296 1726855348.62968: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.63049: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.63118: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.63129: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.63136: when evaluation is False, skipping this task 34296 1726855348.63142: _execute() done 34296 1726855348.63148: dumping result to json 34296 1726855348.63161: done dumping result, returning 34296 1726855348.63175: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-a97a-1acc-0000000000c0] 34296 1726855348.63184: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.63440: no more pending results, returning what we have 34296 1726855348.63444: results queue empty 34296 1726855348.63445: checking for any_errors_fatal 34296 1726855348.63452: done checking for any_errors_fatal 34296 1726855348.63455: checking for max_fail_percentage 34296 
1726855348.63456: done checking for max_fail_percentage 34296 1726855348.63457: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.63458: done checking to see if all hosts have failed 34296 1726855348.63458: getting the remaining hosts for this loop 34296 1726855348.63460: done getting the remaining hosts for this loop 34296 1726855348.63464: getting the next task for host managed_node1 34296 1726855348.63472: done getting next task for host managed_node1 34296 1726855348.63476: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855348.63480: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.63501: getting variables 34296 1726855348.63503: in VariableManager get_vars() 34296 1726855348.63552: Calling all_inventory to load vars for managed_node1 34296 1726855348.63555: Calling groups_inventory to load vars for managed_node1 34296 1726855348.63558: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.63572: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.63575: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.63578: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.63949: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c0 34296 1726855348.63953: WORKER PROCESS EXITING 34296 1726855348.63978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.64194: done with get_vars() 34296 1726855348.64207: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:28 -0400 (0:00:00.033) 0:00:04.680 ****** 34296 1726855348.64301: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855348.64699: worker is 1 (out of 1 available) 34296 1726855348.64710: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855348.64719: done queuing things up, now waiting for results queue to drain 34296 1726855348.64720: waiting for pending results... 
34296 1726855348.64896: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855348.65028: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c1 34296 1726855348.65049: variable 'ansible_search_path' from source: unknown 34296 1726855348.65060: variable 'ansible_search_path' from source: unknown 34296 1726855348.65103: calling self._execute() 34296 1726855348.65206: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.65217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.65229: variable 'omit' from source: magic vars 34296 1726855348.65695: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.65711: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.65820: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.65831: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.65838: when evaluation is False, skipping this task 34296 1726855348.65844: _execute() done 34296 1726855348.65850: dumping result to json 34296 1726855348.65858: done dumping result, returning 34296 1726855348.65875: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-a97a-1acc-0000000000c1] 34296 1726855348.65884: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c1 34296 1726855348.66210: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c1 34296 1726855348.66213: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.66254: no more pending results, returning what we have 34296 1726855348.66258: results queue empty 34296 1726855348.66259: checking for any_errors_fatal 34296 
1726855348.66269: done checking for any_errors_fatal 34296 1726855348.66270: checking for max_fail_percentage 34296 1726855348.66271: done checking for max_fail_percentage 34296 1726855348.66272: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.66273: done checking to see if all hosts have failed 34296 1726855348.66274: getting the remaining hosts for this loop 34296 1726855348.66275: done getting the remaining hosts for this loop 34296 1726855348.66279: getting the next task for host managed_node1 34296 1726855348.66284: done getting next task for host managed_node1 34296 1726855348.66290: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855348.66293: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.66310: getting variables 34296 1726855348.66311: in VariableManager get_vars() 34296 1726855348.66353: Calling all_inventory to load vars for managed_node1 34296 1726855348.66355: Calling groups_inventory to load vars for managed_node1 34296 1726855348.66358: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.66369: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.66372: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.66374: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.66637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.66820: done with get_vars() 34296 1726855348.66830: done getting variables 34296 1726855348.67090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:28 -0400 (0:00:00.028) 0:00:04.708 ****** 34296 1726855348.67122: entering _queue_task() for managed_node1/debug 34296 1726855348.67616: worker is 1 (out of 1 available) 34296 1726855348.67631: exiting _queue_task() for managed_node1/debug 34296 1726855348.67642: done queuing things up, now waiting for results queue to drain 34296 1726855348.67643: waiting for pending results... 
34296 1726855348.68083: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855348.68412: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c2 34296 1726855348.68509: variable 'ansible_search_path' from source: unknown 34296 1726855348.68514: variable 'ansible_search_path' from source: unknown 34296 1726855348.68555: calling self._execute() 34296 1726855348.68760: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.68764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.68775: variable 'omit' from source: magic vars 34296 1726855348.69458: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.69471: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.69670: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.69673: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.69676: when evaluation is False, skipping this task 34296 1726855348.69678: _execute() done 34296 1726855348.69681: dumping result to json 34296 1726855348.69683: done dumping result, returning 34296 1726855348.69694: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-a97a-1acc-0000000000c2] 34296 1726855348.69700: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c2 34296 1726855348.69918: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c2 34296 1726855348.69921: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.70006: no more pending results, returning what we have 34296 1726855348.70010: results queue empty 34296 1726855348.70010: checking for any_errors_fatal 34296 1726855348.70015: done 
checking for any_errors_fatal 34296 1726855348.70016: checking for max_fail_percentage 34296 1726855348.70017: done checking for max_fail_percentage 34296 1726855348.70018: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.70019: done checking to see if all hosts have failed 34296 1726855348.70020: getting the remaining hosts for this loop 34296 1726855348.70021: done getting the remaining hosts for this loop 34296 1726855348.70024: getting the next task for host managed_node1 34296 1726855348.70029: done getting next task for host managed_node1 34296 1726855348.70032: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855348.70035: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.70051: getting variables 34296 1726855348.70052: in VariableManager get_vars() 34296 1726855348.70096: Calling all_inventory to load vars for managed_node1 34296 1726855348.70099: Calling groups_inventory to load vars for managed_node1 34296 1726855348.70101: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.70109: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.70112: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.70114: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.70300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.70509: done with get_vars() 34296 1726855348.70520: done getting variables 34296 1726855348.70585: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:28 -0400 (0:00:00.034) 0:00:04.743 ****** 34296 1726855348.70622: entering _queue_task() for managed_node1/debug 34296 1726855348.70928: worker is 1 (out of 1 available) 34296 1726855348.70943: exiting _queue_task() for managed_node1/debug 34296 1726855348.70957: done queuing things up, now waiting for results queue to drain 34296 1726855348.70958: waiting for pending results... 
34296 1726855348.71237: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855348.71520: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c3 34296 1726855348.71744: variable 'ansible_search_path' from source: unknown 34296 1726855348.71748: variable 'ansible_search_path' from source: unknown 34296 1726855348.71751: calling self._execute() 34296 1726855348.71753: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.71756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.71758: variable 'omit' from source: magic vars 34296 1726855348.72619: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.72741: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.72945: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.73070: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.73073: when evaluation is False, skipping this task 34296 1726855348.73076: _execute() done 34296 1726855348.73078: dumping result to json 34296 1726855348.73080: done dumping result, returning 34296 1726855348.73177: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-a97a-1acc-0000000000c3] 34296 1726855348.73181: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c3 34296 1726855348.73253: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c3 34296 1726855348.73257: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.73334: no more pending results, returning what we have 34296 1726855348.73337: results queue empty 34296 1726855348.73339: checking for any_errors_fatal 34296 1726855348.73348: done 
checking for any_errors_fatal 34296 1726855348.73349: checking for max_fail_percentage 34296 1726855348.73351: done checking for max_fail_percentage 34296 1726855348.73351: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.73352: done checking to see if all hosts have failed 34296 1726855348.73353: getting the remaining hosts for this loop 34296 1726855348.73355: done getting the remaining hosts for this loop 34296 1726855348.73359: getting the next task for host managed_node1 34296 1726855348.73368: done getting next task for host managed_node1 34296 1726855348.73372: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855348.73376: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.73397: getting variables 34296 1726855348.73399: in VariableManager get_vars() 34296 1726855348.73447: Calling all_inventory to load vars for managed_node1 34296 1726855348.73451: Calling groups_inventory to load vars for managed_node1 34296 1726855348.73454: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.73468: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.73471: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.73474: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.73982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.74197: done with get_vars() 34296 1726855348.74207: done getting variables 34296 1726855348.74267: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:28 -0400 (0:00:00.036) 0:00:04.780 ****** 34296 1726855348.74300: entering _queue_task() for managed_node1/debug 34296 1726855348.74578: worker is 1 (out of 1 available) 34296 1726855348.74595: exiting _queue_task() for managed_node1/debug 34296 1726855348.74607: done queuing things up, now waiting for results queue to drain 34296 1726855348.74608: waiting for pending results... 
34296 1726855348.74916: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855348.74925: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c4 34296 1726855348.74938: variable 'ansible_search_path' from source: unknown 34296 1726855348.74941: variable 'ansible_search_path' from source: unknown 34296 1726855348.74981: calling self._execute() 34296 1726855348.75068: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.75078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.75091: variable 'omit' from source: magic vars 34296 1726855348.75925: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.75929: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.76049: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.76056: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.76059: when evaluation is False, skipping this task 34296 1726855348.76062: _execute() done 34296 1726855348.76064: dumping result to json 34296 1726855348.76069: done dumping result, returning 34296 1726855348.76077: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-a97a-1acc-0000000000c4] 34296 1726855348.76082: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c4 34296 1726855348.76212: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c4 34296 1726855348.76215: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.76292: no more pending results, returning what we have 34296 1726855348.76296: results queue empty 34296 1726855348.76297: checking for any_errors_fatal 34296 1726855348.76303: done checking for 
any_errors_fatal 34296 1726855348.76303: checking for max_fail_percentage 34296 1726855348.76305: done checking for max_fail_percentage 34296 1726855348.76306: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.76307: done checking to see if all hosts have failed 34296 1726855348.76307: getting the remaining hosts for this loop 34296 1726855348.76309: done getting the remaining hosts for this loop 34296 1726855348.76312: getting the next task for host managed_node1 34296 1726855348.76319: done getting next task for host managed_node1 34296 1726855348.76326: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855348.76330: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.76349: getting variables 34296 1726855348.76351: in VariableManager get_vars() 34296 1726855348.76398: Calling all_inventory to load vars for managed_node1 34296 1726855348.76400: Calling groups_inventory to load vars for managed_node1 34296 1726855348.76402: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.76410: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.76412: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.76415: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.76707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.76927: done with get_vars() 34296 1726855348.76938: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:28 -0400 (0:00:00.027) 0:00:04.808 ****** 34296 1726855348.77046: entering _queue_task() for managed_node1/ping 34296 1726855348.77355: worker is 1 (out of 1 available) 34296 1726855348.77370: exiting _queue_task() for managed_node1/ping 34296 1726855348.77383: done queuing things up, now waiting for results queue to drain 34296 1726855348.77385: waiting for pending results... 
34296 1726855348.77668: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855348.77923: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000c5 34296 1726855348.77926: variable 'ansible_search_path' from source: unknown 34296 1726855348.77929: variable 'ansible_search_path' from source: unknown 34296 1726855348.77932: calling self._execute() 34296 1726855348.77969: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.77981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.77999: variable 'omit' from source: magic vars 34296 1726855348.78410: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.78430: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.78563: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.78728: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.78761: when evaluation is False, skipping this task 34296 1726855348.78772: _execute() done 34296 1726855348.78802: dumping result to json 34296 1726855348.78812: done dumping result, returning 34296 1726855348.79035: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-a97a-1acc-0000000000c5] 34296 1726855348.79039: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.79254: no more pending results, returning what we have 34296 1726855348.79258: results queue empty 34296 1726855348.79259: checking for any_errors_fatal 34296 1726855348.79268: done checking for any_errors_fatal 34296 1726855348.79269: checking for max_fail_percentage 34296 1726855348.79271: done checking for 
max_fail_percentage 34296 1726855348.79272: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.79273: done checking to see if all hosts have failed 34296 1726855348.79274: getting the remaining hosts for this loop 34296 1726855348.79276: done getting the remaining hosts for this loop 34296 1726855348.79281: getting the next task for host managed_node1 34296 1726855348.79365: done getting next task for host managed_node1 34296 1726855348.79371: ^ task is: TASK: meta (role_complete) 34296 1726855348.79374: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34296 1726855348.79410: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000c5 34296 1726855348.79413: WORKER PROCESS EXITING 34296 1726855348.79423: getting variables 34296 1726855348.79425: in VariableManager get_vars() 34296 1726855348.79593: Calling all_inventory to load vars for managed_node1 34296 1726855348.79596: Calling groups_inventory to load vars for managed_node1 34296 1726855348.79598: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.79607: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.79610: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.79612: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.80013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.80470: done with get_vars() 34296 1726855348.80482: done getting variables 34296 1726855348.80724: done queuing things up, now waiting for results queue to drain 34296 1726855348.80726: results queue empty 34296 1726855348.80727: checking for any_errors_fatal 34296 1726855348.80730: done checking for any_errors_fatal 34296 1726855348.80730: checking for max_fail_percentage 34296 1726855348.80732: done checking for max_fail_percentage 34296 1726855348.80732: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.80733: done checking to see if all hosts have failed 34296 1726855348.80734: getting the remaining hosts for this loop 34296 1726855348.80734: done getting the remaining hosts for this loop 34296 1726855348.80737: getting the next task for host managed_node1 34296 1726855348.80743: done getting next task for host managed_node1 34296 1726855348.80746: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855348.80748: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34296 1726855348.80757: getting variables 34296 1726855348.80758: in VariableManager get_vars() 34296 1726855348.80781: Calling all_inventory to load vars for managed_node1 34296 1726855348.80783: Calling groups_inventory to load vars for managed_node1 34296 1726855348.80785: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.80893: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.80896: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.80899: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.81044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.81416: done with get_vars() 34296 1726855348.81429: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 14:02:28 -0400 (0:00:00.044) 0:00:04.852 ****** 34296 1726855348.81508: entering _queue_task() for managed_node1/include_tasks 34296 1726855348.82215: worker is 1 (out of 1 available) 34296 
1726855348.82228: exiting _queue_task() for managed_node1/include_tasks 34296 1726855348.82240: done queuing things up, now waiting for results queue to drain 34296 1726855348.82242: waiting for pending results... 34296 1726855348.82448: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34296 1726855348.82592: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000fd 34296 1726855348.82613: variable 'ansible_search_path' from source: unknown 34296 1726855348.82629: variable 'ansible_search_path' from source: unknown 34296 1726855348.82738: calling self._execute() 34296 1726855348.82774: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.82793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.82808: variable 'omit' from source: magic vars 34296 1726855348.83400: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.83420: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.83722: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.83734: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.83814: when evaluation is False, skipping this task 34296 1726855348.83817: _execute() done 34296 1726855348.83819: dumping result to json 34296 1726855348.83935: done dumping result, returning 34296 1726855348.83940: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcc66-ac2b-a97a-1acc-0000000000fd] 34296 1726855348.83943: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000fd 34296 1726855348.84394: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000fd 34296 1726855348.84397: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", 
"skip_reason": "Conditional result was False" } 34296 1726855348.84629: no more pending results, returning what we have 34296 1726855348.84633: results queue empty 34296 1726855348.84634: checking for any_errors_fatal 34296 1726855348.84635: done checking for any_errors_fatal 34296 1726855348.84636: checking for max_fail_percentage 34296 1726855348.84637: done checking for max_fail_percentage 34296 1726855348.84638: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.84639: done checking to see if all hosts have failed 34296 1726855348.84640: getting the remaining hosts for this loop 34296 1726855348.84641: done getting the remaining hosts for this loop 34296 1726855348.84645: getting the next task for host managed_node1 34296 1726855348.84651: done getting next task for host managed_node1 34296 1726855348.84655: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855348.84659: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855348.84678: getting variables 34296 1726855348.84680: in VariableManager get_vars() 34296 1726855348.84726: Calling all_inventory to load vars for managed_node1 34296 1726855348.84729: Calling groups_inventory to load vars for managed_node1 34296 1726855348.84732: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.84741: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.84744: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.84747: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.86547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.86960: done with get_vars() 34296 1726855348.86975: done getting variables 34296 1726855348.87036: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 14:02:28 -0400 (0:00:00.055) 0:00:04.908 ****** 34296 1726855348.87073: entering _queue_task() for managed_node1/debug 34296 1726855348.87997: worker is 1 (out of 1 available) 34296 1726855348.88005: exiting _queue_task() for managed_node1/debug 34296 1726855348.88015: done queuing things up, now waiting for results queue to drain 34296 1726855348.88016: waiting for pending results... 
34296 1726855348.88389: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34296 1726855348.88559: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000fe 34296 1726855348.88615: variable 'ansible_search_path' from source: unknown 34296 1726855348.88619: variable 'ansible_search_path' from source: unknown 34296 1726855348.88757: calling self._execute() 34296 1726855348.88836: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.88898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.88909: variable 'omit' from source: magic vars 34296 1726855348.89702: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.89793: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.89962: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.89970: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.89973: when evaluation is False, skipping this task 34296 1726855348.89976: _execute() done 34296 1726855348.89979: dumping result to json 34296 1726855348.89981: done dumping result, returning 34296 1726855348.89988: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcc66-ac2b-a97a-1acc-0000000000fe] 34296 1726855348.89995: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000fe 34296 1726855348.90090: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000fe 34296 1726855348.90094: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855348.90177: no more pending results, returning what we have 34296 1726855348.90181: results queue empty 34296 1726855348.90182: checking for any_errors_fatal 34296 1726855348.90191: done checking for any_errors_fatal 34296 1726855348.90192: 
checking for max_fail_percentage 34296 1726855348.90194: done checking for max_fail_percentage 34296 1726855348.90195: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.90196: done checking to see if all hosts have failed 34296 1726855348.90196: getting the remaining hosts for this loop 34296 1726855348.90198: done getting the remaining hosts for this loop 34296 1726855348.90201: getting the next task for host managed_node1 34296 1726855348.90209: done getting next task for host managed_node1 34296 1726855348.90213: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855348.90218: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855348.90238: getting variables 34296 1726855348.90240: in VariableManager get_vars() 34296 1726855348.90592: Calling all_inventory to load vars for managed_node1 34296 1726855348.90595: Calling groups_inventory to load vars for managed_node1 34296 1726855348.90598: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.90607: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.90610: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.90613: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.90784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.91294: done with get_vars() 34296 1726855348.91308: done getting variables 34296 1726855348.91371: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 14:02:28 -0400 (0:00:00.043) 0:00:04.951 ****** 34296 1726855348.91410: entering _queue_task() for managed_node1/fail 34296 1726855348.91851: worker is 1 (out of 1 available) 34296 1726855348.91871: exiting _queue_task() for managed_node1/fail 34296 1726855348.91884: done queuing things up, now waiting for results queue to drain 34296 1726855348.91886: waiting for pending results... 
34296 1726855348.92304: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34296 1726855348.92355: in run() - task 0affcc66-ac2b-a97a-1acc-0000000000ff 34296 1726855348.92380: variable 'ansible_search_path' from source: unknown 34296 1726855348.92392: variable 'ansible_search_path' from source: unknown 34296 1726855348.92441: calling self._execute() 34296 1726855348.92537: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.92550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.92568: variable 'omit' from source: magic vars 34296 1726855348.92974: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.92995: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.93119: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.93131: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.93192: when evaluation is False, skipping this task 34296 1726855348.93195: _execute() done 34296 1726855348.93198: dumping result to json 34296 1726855348.93201: done dumping result, returning 34296 1726855348.93204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcc66-ac2b-a97a-1acc-0000000000ff] 34296 1726855348.93206: sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000ff skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.93344: no more pending results, returning what we have 34296 1726855348.93349: results queue empty 34296 1726855348.93350: 
checking for any_errors_fatal 34296 1726855348.93356: done checking for any_errors_fatal 34296 1726855348.93357: checking for max_fail_percentage 34296 1726855348.93359: done checking for max_fail_percentage 34296 1726855348.93359: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.93360: done checking to see if all hosts have failed 34296 1726855348.93361: getting the remaining hosts for this loop 34296 1726855348.93362: done getting the remaining hosts for this loop 34296 1726855348.93370: getting the next task for host managed_node1 34296 1726855348.93377: done getting next task for host managed_node1 34296 1726855348.93381: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855348.93388: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855348.93414: getting variables 34296 1726855348.93417: in VariableManager get_vars() 34296 1726855348.93472: Calling all_inventory to load vars for managed_node1 34296 1726855348.93476: Calling groups_inventory to load vars for managed_node1 34296 1726855348.93478: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.93695: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.93699: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.93703: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.94004: done sending task result for task 0affcc66-ac2b-a97a-1acc-0000000000ff 34296 1726855348.94007: WORKER PROCESS EXITING 34296 1726855348.94035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.94252: done with get_vars() 34296 1726855348.94267: done getting variables 34296 1726855348.94321: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 14:02:28 -0400 (0:00:00.029) 0:00:04.981 ****** 34296 1726855348.94356: entering _queue_task() for managed_node1/fail 34296 1726855348.94637: worker is 1 (out of 1 available) 34296 1726855348.94652: exiting _queue_task() for managed_node1/fail 34296 1726855348.94664: done queuing things up, now waiting for results queue to drain 34296 1726855348.94781: waiting for pending results... 
34296 1726855348.94967: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34296 1726855348.95125: in run() - task 0affcc66-ac2b-a97a-1acc-000000000100 34296 1726855348.95147: variable 'ansible_search_path' from source: unknown 34296 1726855348.95155: variable 'ansible_search_path' from source: unknown 34296 1726855348.95198: calling self._execute() 34296 1726855348.95298: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.95309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.95324: variable 'omit' from source: magic vars 34296 1726855348.95718: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.95736: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.95859: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.95877: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.95890: when evaluation is False, skipping this task 34296 1726855348.95899: _execute() done 34296 1726855348.95906: dumping result to json 34296 1726855348.95914: done dumping result, returning 34296 1726855348.95924: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcc66-ac2b-a97a-1acc-000000000100] 34296 1726855348.95935: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000100 34296 1726855348.96062: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000100 34296 1726855348.96068: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.96154: no more 
pending results, returning what we have 34296 1726855348.96159: results queue empty 34296 1726855348.96160: checking for any_errors_fatal 34296 1726855348.96169: done checking for any_errors_fatal 34296 1726855348.96170: checking for max_fail_percentage 34296 1726855348.96172: done checking for max_fail_percentage 34296 1726855348.96173: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.96174: done checking to see if all hosts have failed 34296 1726855348.96175: getting the remaining hosts for this loop 34296 1726855348.96176: done getting the remaining hosts for this loop 34296 1726855348.96180: getting the next task for host managed_node1 34296 1726855348.96190: done getting next task for host managed_node1 34296 1726855348.96194: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855348.96199: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855348.96337: getting variables 34296 1726855348.96340: in VariableManager get_vars() 34296 1726855348.96444: Calling all_inventory to load vars for managed_node1 34296 1726855348.96447: Calling groups_inventory to load vars for managed_node1 34296 1726855348.96450: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.96460: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.96463: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.96468: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.96750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.96982: done with get_vars() 34296 1726855348.96996: done getting variables 34296 1726855348.97052: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 14:02:28 -0400 (0:00:00.027) 0:00:05.008 ****** 34296 1726855348.97097: entering _queue_task() for managed_node1/fail 34296 1726855348.97445: worker is 1 (out of 1 available) 34296 1726855348.97458: exiting _queue_task() for managed_node1/fail 34296 1726855348.97473: done queuing things up, now waiting for results queue to drain 34296 1726855348.97474: waiting for pending results... 
34296 1726855348.97806: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34296 1726855348.97899: in run() - task 0affcc66-ac2b-a97a-1acc-000000000101 34296 1726855348.97919: variable 'ansible_search_path' from source: unknown 34296 1726855348.97928: variable 'ansible_search_path' from source: unknown 34296 1726855348.97977: calling self._execute() 34296 1726855348.98185: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855348.98191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855348.98194: variable 'omit' from source: magic vars 34296 1726855348.98494: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.98517: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855348.98640: variable 'ansible_distribution_major_version' from source: facts 34296 1726855348.98651: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855348.98659: when evaluation is False, skipping this task 34296 1726855348.98665: _execute() done 34296 1726855348.98674: dumping result to json 34296 1726855348.98680: done dumping result, returning 34296 1726855348.98692: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcc66-ac2b-a97a-1acc-000000000101] 34296 1726855348.98700: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000101 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855348.98880: no more pending results, returning what we have 34296 1726855348.98884: results queue empty 34296 1726855348.98885: checking for any_errors_fatal 34296 
1726855348.98892: done checking for any_errors_fatal 34296 1726855348.98892: checking for max_fail_percentage 34296 1726855348.98894: done checking for max_fail_percentage 34296 1726855348.98895: checking to see if all hosts have failed and the running result is not ok 34296 1726855348.98896: done checking to see if all hosts have failed 34296 1726855348.98896: getting the remaining hosts for this loop 34296 1726855348.98898: done getting the remaining hosts for this loop 34296 1726855348.98902: getting the next task for host managed_node1 34296 1726855348.98908: done getting next task for host managed_node1 34296 1726855348.98912: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855348.98916: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855348.99049: getting variables 34296 1726855348.99052: in VariableManager get_vars() 34296 1726855348.99107: Calling all_inventory to load vars for managed_node1 34296 1726855348.99110: Calling groups_inventory to load vars for managed_node1 34296 1726855348.99113: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855348.99124: Calling all_plugins_play to load vars for managed_node1 34296 1726855348.99126: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855348.99129: Calling groups_plugins_play to load vars for managed_node1 34296 1726855348.99516: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000101 34296 1726855348.99520: WORKER PROCESS EXITING 34296 1726855348.99542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855348.99745: done with get_vars() 34296 1726855348.99755: done getting variables 34296 1726855348.99819: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 14:02:28 -0400 (0:00:00.027) 0:00:05.036 ****** 34296 1726855348.99850: entering _queue_task() for managed_node1/dnf 34296 1726855349.00250: worker is 1 (out of 1 available) 34296 1726855349.00262: exiting _queue_task() for managed_node1/dnf 34296 1726855349.00275: done queuing things up, now waiting for results queue to drain 34296 1726855349.00277: waiting for pending results... 
34296 1726855349.00449: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34296 1726855349.00603: in run() - task 0affcc66-ac2b-a97a-1acc-000000000102 34296 1726855349.00629: variable 'ansible_search_path' from source: unknown 34296 1726855349.00638: variable 'ansible_search_path' from source: unknown 34296 1726855349.00685: calling self._execute() 34296 1726855349.00781: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.00799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.00814: variable 'omit' from source: magic vars 34296 1726855349.01217: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.01238: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.01358: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.01379: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.01389: when evaluation is False, skipping this task 34296 1726855349.01398: _execute() done 34296 1726855349.01405: dumping result to json 34296 1726855349.01412: done dumping result, returning 34296 1726855349.01423: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000102] 34296 1726855349.01430: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000102 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.01597: no more pending results, returning what we have 34296 1726855349.01601: results queue empty 34296 
1726855349.01602: checking for any_errors_fatal 34296 1726855349.01610: done checking for any_errors_fatal 34296 1726855349.01611: checking for max_fail_percentage 34296 1726855349.01613: done checking for max_fail_percentage 34296 1726855349.01613: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.01614: done checking to see if all hosts have failed 34296 1726855349.01615: getting the remaining hosts for this loop 34296 1726855349.01616: done getting the remaining hosts for this loop 34296 1726855349.01619: getting the next task for host managed_node1 34296 1726855349.01626: done getting next task for host managed_node1 34296 1726855349.01630: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855349.01634: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.01652: getting variables 34296 1726855349.01653: in VariableManager get_vars() 34296 1726855349.01710: Calling all_inventory to load vars for managed_node1 34296 1726855349.01713: Calling groups_inventory to load vars for managed_node1 34296 1726855349.01716: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.01726: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.01730: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.01733: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.02204: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000102 34296 1726855349.02208: WORKER PROCESS EXITING 34296 1726855349.02238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.02458: done with get_vars() 34296 1726855349.02471: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34296 1726855349.02553: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 14:02:29 -0400 (0:00:00.027) 0:00:05.063 ****** 34296 1726855349.02588: entering _queue_task() for managed_node1/yum 34296 1726855349.02986: worker is 1 (out of 1 available) 34296 1726855349.03000: exiting _queue_task() for managed_node1/yum 34296 1726855349.03009: done queuing things up, now 
waiting for results queue to drain 34296 1726855349.03011: waiting for pending results... 34296 1726855349.03203: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34296 1726855349.03353: in run() - task 0affcc66-ac2b-a97a-1acc-000000000103 34296 1726855349.03379: variable 'ansible_search_path' from source: unknown 34296 1726855349.03389: variable 'ansible_search_path' from source: unknown 34296 1726855349.03431: calling self._execute() 34296 1726855349.03538: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.03551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.03574: variable 'omit' from source: magic vars 34296 1726855349.03982: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.04007: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.04128: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.04140: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.04148: when evaluation is False, skipping this task 34296 1726855349.04155: _execute() done 34296 1726855349.04163: dumping result to json 34296 1726855349.04174: done dumping result, returning 34296 1726855349.04194: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000103] 34296 1726855349.04293: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000103 34296 1726855349.04374: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000103 34296 1726855349.04377: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.04437: no more pending results, returning what we have 34296 1726855349.04441: results queue empty 34296 1726855349.04442: checking for any_errors_fatal 34296 1726855349.04449: done checking for any_errors_fatal 34296 1726855349.04449: checking for max_fail_percentage 34296 1726855349.04451: done checking for max_fail_percentage 34296 1726855349.04452: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.04452: done checking to see if all hosts have failed 34296 1726855349.04453: getting the remaining hosts for this loop 34296 1726855349.04454: done getting the remaining hosts for this loop 34296 1726855349.04458: getting the next task for host managed_node1 34296 1726855349.04469: done getting next task for host managed_node1 34296 1726855349.04474: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855349.04478: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.04501: getting variables 34296 1726855349.04615: in VariableManager get_vars() 34296 1726855349.04676: Calling all_inventory to load vars for managed_node1 34296 1726855349.04680: Calling groups_inventory to load vars for managed_node1 34296 1726855349.04683: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.04796: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.04799: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.04803: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.05100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.05298: done with get_vars() 34296 1726855349.05309: done getting variables 34296 1726855349.05369: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 14:02:29 -0400 (0:00:00.028) 0:00:05.091 ****** 34296 1726855349.05411: entering _queue_task() for managed_node1/fail 34296 1726855349.05831: worker is 1 (out of 1 available) 34296 1726855349.05843: exiting _queue_task() for managed_node1/fail 34296 1726855349.05855: done queuing things up, now waiting for results queue to drain 34296 1726855349.05856: waiting for pending results... 
34296 1726855349.06158: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34296 1726855349.06362: in run() - task 0affcc66-ac2b-a97a-1acc-000000000104 34296 1726855349.06368: variable 'ansible_search_path' from source: unknown 34296 1726855349.06372: variable 'ansible_search_path' from source: unknown 34296 1726855349.06375: calling self._execute() 34296 1726855349.06490: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.06793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.06797: variable 'omit' from source: magic vars 34296 1726855349.07383: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.07478: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.07727: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.07795: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.07803: when evaluation is False, skipping this task 34296 1726855349.07810: _execute() done 34296 1726855349.07817: dumping result to json 34296 1726855349.07825: done dumping result, returning 34296 1726855349.07838: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000104] 34296 1726855349.07847: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000104 34296 1726855349.08169: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000104 34296 1726855349.08174: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.08231: no more pending results, returning what we have 
34296 1726855349.08235: results queue empty 34296 1726855349.08236: checking for any_errors_fatal 34296 1726855349.08245: done checking for any_errors_fatal 34296 1726855349.08245: checking for max_fail_percentage 34296 1726855349.08247: done checking for max_fail_percentage 34296 1726855349.08248: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.08249: done checking to see if all hosts have failed 34296 1726855349.08249: getting the remaining hosts for this loop 34296 1726855349.08251: done getting the remaining hosts for this loop 34296 1726855349.08256: getting the next task for host managed_node1 34296 1726855349.08264: done getting next task for host managed_node1 34296 1726855349.08270: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34296 1726855349.08275: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.08300: getting variables 34296 1726855349.08302: in VariableManager get_vars() 34296 1726855349.08351: Calling all_inventory to load vars for managed_node1 34296 1726855349.08354: Calling groups_inventory to load vars for managed_node1 34296 1726855349.08357: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.08370: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.08373: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.08376: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.08782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.09579: done with get_vars() 34296 1726855349.09595: done getting variables 34296 1726855349.09701: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 14:02:29 -0400 (0:00:00.043) 0:00:05.135 ****** 34296 1726855349.09735: entering _queue_task() for managed_node1/package 34296 1726855349.10408: worker is 1 (out of 1 available) 34296 1726855349.10500: exiting _queue_task() for managed_node1/package 34296 1726855349.10514: done queuing things up, now waiting for results queue to drain 34296 1726855349.10515: waiting for pending results... 
34296 1726855349.11133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34296 1726855349.11338: in run() - task 0affcc66-ac2b-a97a-1acc-000000000105 34296 1726855349.11342: variable 'ansible_search_path' from source: unknown 34296 1726855349.11345: variable 'ansible_search_path' from source: unknown 34296 1726855349.11426: calling self._execute() 34296 1726855349.11693: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.11697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.11699: variable 'omit' from source: magic vars 34296 1726855349.12457: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.12470: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.12762: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.12768: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.12775: when evaluation is False, skipping this task 34296 1726855349.12778: _execute() done 34296 1726855349.12781: dumping result to json 34296 1726855349.12783: done dumping result, returning 34296 1726855349.12794: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcc66-ac2b-a97a-1acc-000000000105] 34296 1726855349.12800: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000105 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.13137: no more pending results, returning what we have 34296 1726855349.13141: results queue empty 34296 1726855349.13142: checking for any_errors_fatal 34296 1726855349.13150: done checking for any_errors_fatal 34296 1726855349.13151: checking for max_fail_percentage 34296 1726855349.13153: done checking for 
max_fail_percentage 34296 1726855349.13154: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.13155: done checking to see if all hosts have failed 34296 1726855349.13155: getting the remaining hosts for this loop 34296 1726855349.13157: done getting the remaining hosts for this loop 34296 1726855349.13161: getting the next task for host managed_node1 34296 1726855349.13171: done getting next task for host managed_node1 34296 1726855349.13175: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855349.13180: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.13374: getting variables 34296 1726855349.13377: in VariableManager get_vars() 34296 1726855349.13700: Calling all_inventory to load vars for managed_node1 34296 1726855349.13703: Calling groups_inventory to load vars for managed_node1 34296 1726855349.13706: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.13716: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.13718: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.13722: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.14171: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000105 34296 1726855349.14175: WORKER PROCESS EXITING 34296 1726855349.14207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.14760: done with get_vars() 34296 1726855349.14776: done getting variables 34296 1726855349.15039: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 14:02:29 -0400 (0:00:00.053) 0:00:05.188 ****** 34296 1726855349.15078: entering _queue_task() for managed_node1/package 34296 1726855349.15781: worker is 1 (out of 1 available) 34296 1726855349.15798: exiting _queue_task() for managed_node1/package 34296 1726855349.15811: done queuing things up, now waiting for results queue to drain 34296 1726855349.15812: waiting for pending results... 
34296 1726855349.16284: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34296 1726855349.16637: in run() - task 0affcc66-ac2b-a97a-1acc-000000000106 34296 1726855349.16682: variable 'ansible_search_path' from source: unknown 34296 1726855349.16686: variable 'ansible_search_path' from source: unknown 34296 1726855349.16730: calling self._execute() 34296 1726855349.17202: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.17209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.17218: variable 'omit' from source: magic vars 34296 1726855349.18316: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.18422: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.18626: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.18645: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.18654: when evaluation is False, skipping this task 34296 1726855349.18663: _execute() done 34296 1726855349.18671: dumping result to json 34296 1726855349.18677: done dumping result, returning 34296 1726855349.18858: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-000000000106] 34296 1726855349.18863: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000106 34296 1726855349.18943: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000106 34296 1726855349.18947: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.19011: no more pending results, returning what we have 34296 1726855349.19015: 
results queue empty 34296 1726855349.19016: checking for any_errors_fatal 34296 1726855349.19026: done checking for any_errors_fatal 34296 1726855349.19027: checking for max_fail_percentage 34296 1726855349.19029: done checking for max_fail_percentage 34296 1726855349.19030: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.19031: done checking to see if all hosts have failed 34296 1726855349.19032: getting the remaining hosts for this loop 34296 1726855349.19034: done getting the remaining hosts for this loop 34296 1726855349.19038: getting the next task for host managed_node1 34296 1726855349.19046: done getting next task for host managed_node1 34296 1726855349.19051: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855349.19057: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.19080: getting variables 34296 1726855349.19082: in VariableManager get_vars() 34296 1726855349.19128: Calling all_inventory to load vars for managed_node1 34296 1726855349.19131: Calling groups_inventory to load vars for managed_node1 34296 1726855349.19133: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.19143: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.19146: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.19148: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.19716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.20284: done with get_vars() 34296 1726855349.20391: done getting variables 34296 1726855349.20452: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 14:02:29 -0400 (0:00:00.054) 0:00:05.242 ****** 34296 1726855349.20605: entering _queue_task() for managed_node1/package 34296 1726855349.21234: worker is 1 (out of 1 available) 34296 1726855349.21284: exiting _queue_task() for managed_node1/package 34296 1726855349.21299: done queuing things up, now waiting for results queue to drain 34296 1726855349.21301: waiting for pending results... 
34296 1726855349.22523: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34296 1726855349.22530: in run() - task 0affcc66-ac2b-a97a-1acc-000000000107 34296 1726855349.22534: variable 'ansible_search_path' from source: unknown 34296 1726855349.22537: variable 'ansible_search_path' from source: unknown 34296 1726855349.22835: calling self._execute() 34296 1726855349.23198: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.23202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.23205: variable 'omit' from source: magic vars 34296 1726855349.24193: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.24197: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.24199: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.24201: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.24204: when evaluation is False, skipping this task 34296 1726855349.24206: _execute() done 34296 1726855349.24209: dumping result to json 34296 1726855349.24211: done dumping result, returning 34296 1726855349.24213: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcc66-ac2b-a97a-1acc-000000000107] 34296 1726855349.24216: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000107 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.24548: no more pending results, returning what we have 34296 1726855349.24552: results queue empty 34296 1726855349.24553: checking for any_errors_fatal 34296 1726855349.24560: done checking for any_errors_fatal 34296 1726855349.24561: 
checking for max_fail_percentage 34296 1726855349.24563: done checking for max_fail_percentage 34296 1726855349.24563: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.24564: done checking to see if all hosts have failed 34296 1726855349.24567: getting the remaining hosts for this loop 34296 1726855349.24568: done getting the remaining hosts for this loop 34296 1726855349.24573: getting the next task for host managed_node1 34296 1726855349.24580: done getting next task for host managed_node1 34296 1726855349.24584: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855349.24590: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.24608: getting variables 34296 1726855349.24610: in VariableManager get_vars() 34296 1726855349.24658: Calling all_inventory to load vars for managed_node1 34296 1726855349.24661: Calling groups_inventory to load vars for managed_node1 34296 1726855349.24663: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.24675: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.24678: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.24680: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.25243: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000107 34296 1726855349.25247: WORKER PROCESS EXITING 34296 1726855349.25384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.25826: done with get_vars() 34296 1726855349.25838: done getting variables 34296 1726855349.26024: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 14:02:29 -0400 (0:00:00.055) 0:00:05.298 ****** 34296 1726855349.26060: entering _queue_task() for managed_node1/service 34296 1726855349.26895: worker is 1 (out of 1 available) 34296 1726855349.26911: exiting _queue_task() for managed_node1/service 34296 1726855349.26924: done queuing things up, now waiting for results queue to drain 34296 1726855349.26926: waiting for pending results... 
34296 1726855349.27362: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34296 1726855349.27486: in run() - task 0affcc66-ac2b-a97a-1acc-000000000108 34296 1726855349.27893: variable 'ansible_search_path' from source: unknown 34296 1726855349.27897: variable 'ansible_search_path' from source: unknown 34296 1726855349.27900: calling self._execute() 34296 1726855349.27902: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.27904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.27907: variable 'omit' from source: magic vars 34296 1726855349.28628: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.28739: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.29050: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.29065: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.29073: when evaluation is False, skipping this task 34296 1726855349.29081: _execute() done 34296 1726855349.29090: dumping result to json 34296 1726855349.29097: done dumping result, returning 34296 1726855349.29110: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcc66-ac2b-a97a-1acc-000000000108] 34296 1726855349.29121: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000108 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.29293: no more pending results, returning what we have 34296 1726855349.29297: results queue empty 34296 1726855349.29298: checking for any_errors_fatal 34296 1726855349.29306: done checking for any_errors_fatal 34296 1726855349.29306: 
checking for max_fail_percentage 34296 1726855349.29308: done checking for max_fail_percentage 34296 1726855349.29309: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.29310: done checking to see if all hosts have failed 34296 1726855349.29310: getting the remaining hosts for this loop 34296 1726855349.29312: done getting the remaining hosts for this loop 34296 1726855349.29316: getting the next task for host managed_node1 34296 1726855349.29324: done getting next task for host managed_node1 34296 1726855349.29328: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855349.29333: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.29357: getting variables 34296 1726855349.29359: in VariableManager get_vars() 34296 1726855349.29647: Calling all_inventory to load vars for managed_node1 34296 1726855349.29650: Calling groups_inventory to load vars for managed_node1 34296 1726855349.29652: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.29663: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.29668: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.29672: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.30403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.30647: done with get_vars() 34296 1726855349.30660: done getting variables 34296 1726855349.30822: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000108 34296 1726855349.30826: WORKER PROCESS EXITING 34296 1726855349.30895: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 14:02:29 -0400 (0:00:00.049) 0:00:05.347 ****** 34296 1726855349.31004: entering _queue_task() for managed_node1/service 34296 1726855349.31891: worker is 1 (out of 1 available) 34296 1726855349.31902: exiting _queue_task() for managed_node1/service 34296 1726855349.31911: done queuing things up, now waiting for results queue to drain 34296 1726855349.31912: waiting for pending results... 
34296 1726855349.32116: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34296 1726855349.32311: in run() - task 0affcc66-ac2b-a97a-1acc-000000000109 34296 1726855349.32333: variable 'ansible_search_path' from source: unknown 34296 1726855349.32342: variable 'ansible_search_path' from source: unknown 34296 1726855349.32389: calling self._execute() 34296 1726855349.32485: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.32499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.32514: variable 'omit' from source: magic vars 34296 1726855349.33025: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.33042: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.33189: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.33201: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.33209: when evaluation is False, skipping this task 34296 1726855349.33220: _execute() done 34296 1726855349.33227: dumping result to json 34296 1726855349.33234: done dumping result, returning 34296 1726855349.33244: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcc66-ac2b-a97a-1acc-000000000109] 34296 1726855349.33253: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000109 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855349.33397: no more pending results, returning what we have 34296 1726855349.33401: results queue empty 34296 1726855349.33402: checking for any_errors_fatal 34296 1726855349.33408: done checking for any_errors_fatal 34296 1726855349.33409: checking for max_fail_percentage 34296 1726855349.33411: done 
checking for max_fail_percentage 34296 1726855349.33412: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.33412: done checking to see if all hosts have failed 34296 1726855349.33413: getting the remaining hosts for this loop 34296 1726855349.33415: done getting the remaining hosts for this loop 34296 1726855349.33418: getting the next task for host managed_node1 34296 1726855349.33426: done getting next task for host managed_node1 34296 1726855349.33430: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855349.33434: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.33456: getting variables 34296 1726855349.33458: in VariableManager get_vars() 34296 1726855349.33509: Calling all_inventory to load vars for managed_node1 34296 1726855349.33512: Calling groups_inventory to load vars for managed_node1 34296 1726855349.33515: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.33526: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.33528: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.33531: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.33958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.34359: done with get_vars() 34296 1726855349.34371: done getting variables 34296 1726855349.34425: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000109 34296 1726855349.34428: WORKER PROCESS EXITING 34296 1726855349.34470: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 14:02:29 -0400 (0:00:00.035) 0:00:05.382 ****** 34296 1726855349.34506: entering _queue_task() for managed_node1/service 34296 1726855349.35022: worker is 1 (out of 1 available) 34296 1726855349.35030: exiting _queue_task() for managed_node1/service 34296 1726855349.35039: done queuing things up, now waiting for results queue to drain 34296 1726855349.35041: waiting for pending results... 
34296 1726855349.35079: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34296 1726855349.35218: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010a 34296 1726855349.35239: variable 'ansible_search_path' from source: unknown 34296 1726855349.35248: variable 'ansible_search_path' from source: unknown 34296 1726855349.35293: calling self._execute() 34296 1726855349.35381: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.35396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.35410: variable 'omit' from source: magic vars 34296 1726855349.35768: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.35786: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.35915: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.35925: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.35932: when evaluation is False, skipping this task 34296 1726855349.35938: _execute() done 34296 1726855349.35944: dumping result to json 34296 1726855349.35951: done dumping result, returning 34296 1726855349.35961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcc66-ac2b-a97a-1acc-00000000010a] 34296 1726855349.36010: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010a 34296 1726855349.36156: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.36236: no more pending results, returning what we have 34296 1726855349.36240: results queue empty 34296 1726855349.36241: checking for any_errors_fatal 34296 1726855349.36247: done checking for 
any_errors_fatal 34296 1726855349.36248: checking for max_fail_percentage 34296 1726855349.36250: done checking for max_fail_percentage 34296 1726855349.36250: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.36251: done checking to see if all hosts have failed 34296 1726855349.36252: getting the remaining hosts for this loop 34296 1726855349.36254: done getting the remaining hosts for this loop 34296 1726855349.36257: getting the next task for host managed_node1 34296 1726855349.36266: done getting next task for host managed_node1 34296 1726855349.36270: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855349.36274: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.36297: getting variables 34296 1726855349.36299: in VariableManager get_vars() 34296 1726855349.36346: Calling all_inventory to load vars for managed_node1 34296 1726855349.36349: Calling groups_inventory to load vars for managed_node1 34296 1726855349.36351: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.36363: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.36366: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.36369: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.36990: WORKER PROCESS EXITING 34296 1726855349.36999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.37237: done with get_vars() 34296 1726855349.37249: done getting variables 34296 1726855349.37307: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 14:02:29 -0400 (0:00:00.028) 0:00:05.411 ****** 34296 1726855349.37338: entering _queue_task() for managed_node1/service 34296 1726855349.37802: worker is 1 (out of 1 available) 34296 1726855349.37814: exiting _queue_task() for managed_node1/service 34296 1726855349.37823: done queuing things up, now waiting for results queue to drain 34296 1726855349.37824: waiting for pending results... 
34296 1726855349.37895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34296 1726855349.38028: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010b 34296 1726855349.38052: variable 'ansible_search_path' from source: unknown 34296 1726855349.38060: variable 'ansible_search_path' from source: unknown 34296 1726855349.38102: calling self._execute() 34296 1726855349.38195: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.38207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.38220: variable 'omit' from source: magic vars 34296 1726855349.38656: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.38709: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.38893: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.38923: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.38931: when evaluation is False, skipping this task 34296 1726855349.38938: _execute() done 34296 1726855349.38945: dumping result to json 34296 1726855349.38952: done dumping result, returning 34296 1726855349.38962: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcc66-ac2b-a97a-1acc-00000000010b] 34296 1726855349.38972: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010b 34296 1726855349.39242: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010b 34296 1726855349.39245: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34296 1726855349.39294: no more pending results, returning what we have 34296 1726855349.39297: results queue empty 34296 1726855349.39298: checking for any_errors_fatal 34296 
1726855349.39305: done checking for any_errors_fatal 34296 1726855349.39306: checking for max_fail_percentage 34296 1726855349.39307: done checking for max_fail_percentage 34296 1726855349.39308: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.39309: done checking to see if all hosts have failed 34296 1726855349.39310: getting the remaining hosts for this loop 34296 1726855349.39311: done getting the remaining hosts for this loop 34296 1726855349.39315: getting the next task for host managed_node1 34296 1726855349.39321: done getting next task for host managed_node1 34296 1726855349.39325: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855349.39330: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.39349: getting variables 34296 1726855349.39351: in VariableManager get_vars() 34296 1726855349.39402: Calling all_inventory to load vars for managed_node1 34296 1726855349.39405: Calling groups_inventory to load vars for managed_node1 34296 1726855349.39407: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.39418: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.39420: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.39423: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.39695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.39897: done with get_vars() 34296 1726855349.39907: done getting variables 34296 1726855349.39964: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 14:02:29 -0400 (0:00:00.026) 0:00:05.437 ****** 34296 1726855349.39997: entering _queue_task() for managed_node1/copy 34296 1726855349.40266: worker is 1 (out of 1 available) 34296 1726855349.40279: exiting _queue_task() for managed_node1/copy 34296 1726855349.40292: done queuing things up, now waiting for results queue to drain 34296 1726855349.40294: waiting for pending results... 
34296 1726855349.40607: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34296 1726855349.40686: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010c 34296 1726855349.40708: variable 'ansible_search_path' from source: unknown 34296 1726855349.40794: variable 'ansible_search_path' from source: unknown 34296 1726855349.40797: calling self._execute() 34296 1726855349.40883: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.40901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.41006: variable 'omit' from source: magic vars 34296 1726855349.41720: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.41899: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.41968: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.41980: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.41991: when evaluation is False, skipping this task 34296 1726855349.42050: _execute() done 34296 1726855349.42060: dumping result to json 34296 1726855349.42069: done dumping result, returning 34296 1726855349.42122: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcc66-ac2b-a97a-1acc-00000000010c] 34296 1726855349.42132: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010c 34296 1726855349.42494: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010c 34296 1726855349.42497: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.42547: no more pending results, returning what we have 34296 1726855349.42551: results queue empty 34296 
1726855349.42552: checking for any_errors_fatal 34296 1726855349.42557: done checking for any_errors_fatal 34296 1726855349.42558: checking for max_fail_percentage 34296 1726855349.42559: done checking for max_fail_percentage 34296 1726855349.42560: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.42561: done checking to see if all hosts have failed 34296 1726855349.42561: getting the remaining hosts for this loop 34296 1726855349.42563: done getting the remaining hosts for this loop 34296 1726855349.42566: getting the next task for host managed_node1 34296 1726855349.42573: done getting next task for host managed_node1 34296 1726855349.42577: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855349.42583: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.42605: getting variables 34296 1726855349.42607: in VariableManager get_vars() 34296 1726855349.42658: Calling all_inventory to load vars for managed_node1 34296 1726855349.42662: Calling groups_inventory to load vars for managed_node1 34296 1726855349.42664: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.42675: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.42679: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.42682: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.43021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.43222: done with get_vars() 34296 1726855349.43233: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 14:02:29 -0400 (0:00:00.033) 0:00:05.470 ****** 34296 1726855349.43316: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855349.43619: worker is 1 (out of 1 available) 34296 1726855349.43632: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34296 1726855349.43644: done queuing things up, now waiting for results queue to drain 34296 1726855349.43645: waiting for pending results... 
34296 1726855349.44112: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34296 1726855349.44118: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010d 34296 1726855349.44154: variable 'ansible_search_path' from source: unknown 34296 1726855349.44162: variable 'ansible_search_path' from source: unknown 34296 1726855349.44250: calling self._execute() 34296 1726855349.44343: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.44424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.44466: variable 'omit' from source: magic vars 34296 1726855349.45214: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.45239: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.45359: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.45370: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.45379: when evaluation is False, skipping this task 34296 1726855349.45386: _execute() done 34296 1726855349.45397: dumping result to json 34296 1726855349.45447: done dumping result, returning 34296 1726855349.45450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcc66-ac2b-a97a-1acc-00000000010d] 34296 1726855349.45453: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.45605: no more pending results, returning what we have 34296 1726855349.45609: results queue empty 34296 1726855349.45610: checking for any_errors_fatal 34296 1726855349.45617: done checking for any_errors_fatal 34296 1726855349.45618: checking for max_fail_percentage 34296 
1726855349.45619: done checking for max_fail_percentage 34296 1726855349.45620: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.45621: done checking to see if all hosts have failed 34296 1726855349.45622: getting the remaining hosts for this loop 34296 1726855349.45623: done getting the remaining hosts for this loop 34296 1726855349.45627: getting the next task for host managed_node1 34296 1726855349.45633: done getting next task for host managed_node1 34296 1726855349.45637: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855349.45641: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.45663: getting variables 34296 1726855349.45665: in VariableManager get_vars() 34296 1726855349.45818: Calling all_inventory to load vars for managed_node1 34296 1726855349.45821: Calling groups_inventory to load vars for managed_node1 34296 1726855349.45824: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.45836: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.45838: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.45841: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.46238: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010d 34296 1726855349.46241: WORKER PROCESS EXITING 34296 1726855349.46326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.46831: done with get_vars() 34296 1726855349.46843: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 14:02:29 -0400 (0:00:00.037) 0:00:05.508 ****** 34296 1726855349.47045: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855349.47657: worker is 1 (out of 1 available) 34296 1726855349.47672: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34296 1726855349.47684: done queuing things up, now waiting for results queue to drain 34296 1726855349.47685: waiting for pending results... 
34296 1726855349.48115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34296 1726855349.48449: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010e 34296 1726855349.48452: variable 'ansible_search_path' from source: unknown 34296 1726855349.48455: variable 'ansible_search_path' from source: unknown 34296 1726855349.48461: calling self._execute() 34296 1726855349.48548: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.48582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.48665: variable 'omit' from source: magic vars 34296 1726855349.49500: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.49504: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.49707: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.49719: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.49726: when evaluation is False, skipping this task 34296 1726855349.49731: _execute() done 34296 1726855349.49751: dumping result to json 34296 1726855349.49759: done dumping result, returning 34296 1726855349.49803: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcc66-ac2b-a97a-1acc-00000000010e] 34296 1726855349.49813: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010e 34296 1726855349.50163: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010e 34296 1726855349.50166: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.50220: no more pending results, returning what we have 34296 1726855349.50224: results queue empty 34296 1726855349.50225: checking for any_errors_fatal 34296 
1726855349.50234: done checking for any_errors_fatal 34296 1726855349.50235: checking for max_fail_percentage 34296 1726855349.50237: done checking for max_fail_percentage 34296 1726855349.50238: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.50238: done checking to see if all hosts have failed 34296 1726855349.50239: getting the remaining hosts for this loop 34296 1726855349.50240: done getting the remaining hosts for this loop 34296 1726855349.50244: getting the next task for host managed_node1 34296 1726855349.50252: done getting next task for host managed_node1 34296 1726855349.50256: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855349.50260: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.50284: getting variables 34296 1726855349.50285: in VariableManager get_vars() 34296 1726855349.50334: Calling all_inventory to load vars for managed_node1 34296 1726855349.50337: Calling groups_inventory to load vars for managed_node1 34296 1726855349.50339: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.50350: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.50353: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.50355: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.50959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.51235: done with get_vars() 34296 1726855349.51246: done getting variables 34296 1726855349.51306: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 14:02:29 -0400 (0:00:00.042) 0:00:05.551 ****** 34296 1726855349.51339: entering _queue_task() for managed_node1/debug 34296 1726855349.51643: worker is 1 (out of 1 available) 34296 1726855349.51659: exiting _queue_task() for managed_node1/debug 34296 1726855349.51670: done queuing things up, now waiting for results queue to drain 34296 1726855349.51672: waiting for pending results... 
34296 1726855349.51898: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34296 1726855349.52015: in run() - task 0affcc66-ac2b-a97a-1acc-00000000010f 34296 1726855349.52027: variable 'ansible_search_path' from source: unknown 34296 1726855349.52031: variable 'ansible_search_path' from source: unknown 34296 1726855349.52076: calling self._execute() 34296 1726855349.52162: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.52166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.52180: variable 'omit' from source: magic vars 34296 1726855349.52554: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.52644: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.52686: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.52700: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.52703: when evaluation is False, skipping this task 34296 1726855349.52705: _execute() done 34296 1726855349.52708: dumping result to json 34296 1726855349.52711: done dumping result, returning 34296 1726855349.52720: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcc66-ac2b-a97a-1acc-00000000010f] 34296 1726855349.52726: sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010f 34296 1726855349.52813: done sending task result for task 0affcc66-ac2b-a97a-1acc-00000000010f 34296 1726855349.52816: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855349.52919: no more pending results, returning what we have 34296 1726855349.52922: results queue empty 34296 1726855349.52924: checking for any_errors_fatal 34296 1726855349.52930: done 
checking for any_errors_fatal 34296 1726855349.52930: checking for max_fail_percentage 34296 1726855349.52932: done checking for max_fail_percentage 34296 1726855349.52932: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.52933: done checking to see if all hosts have failed 34296 1726855349.52934: getting the remaining hosts for this loop 34296 1726855349.52935: done getting the remaining hosts for this loop 34296 1726855349.52938: getting the next task for host managed_node1 34296 1726855349.52944: done getting next task for host managed_node1 34296 1726855349.52947: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855349.52951: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.52967: getting variables 34296 1726855349.52968: in VariableManager get_vars() 34296 1726855349.53008: Calling all_inventory to load vars for managed_node1 34296 1726855349.53010: Calling groups_inventory to load vars for managed_node1 34296 1726855349.53012: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.53021: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.53023: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.53026: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.53540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.54050: done with get_vars() 34296 1726855349.54062: done getting variables 34296 1726855349.54125: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 14:02:29 -0400 (0:00:00.028) 0:00:05.579 ****** 34296 1726855349.54159: entering _queue_task() for managed_node1/debug 34296 1726855349.54740: worker is 1 (out of 1 available) 34296 1726855349.54803: exiting _queue_task() for managed_node1/debug 34296 1726855349.54814: done queuing things up, now waiting for results queue to drain 34296 1726855349.54815: waiting for pending results... 
34296 1726855349.55005: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34296 1726855349.55129: in run() - task 0affcc66-ac2b-a97a-1acc-000000000110 34296 1726855349.55150: variable 'ansible_search_path' from source: unknown 34296 1726855349.55158: variable 'ansible_search_path' from source: unknown 34296 1726855349.55201: calling self._execute() 34296 1726855349.55320: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.55324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.55327: variable 'omit' from source: magic vars 34296 1726855349.55697: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.55756: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.55843: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.55856: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.55871: when evaluation is False, skipping this task 34296 1726855349.55878: _execute() done 34296 1726855349.55885: dumping result to json 34296 1726855349.55973: done dumping result, returning 34296 1726855349.55977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcc66-ac2b-a97a-1acc-000000000110] 34296 1726855349.55979: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000110 34296 1726855349.56046: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000110 34296 1726855349.56052: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855349.56108: no more pending results, returning what we have 34296 1726855349.56111: results queue empty 34296 1726855349.56113: checking for any_errors_fatal 34296 1726855349.56118: done 
checking for any_errors_fatal 34296 1726855349.56119: checking for max_fail_percentage 34296 1726855349.56121: done checking for max_fail_percentage 34296 1726855349.56122: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.56123: done checking to see if all hosts have failed 34296 1726855349.56124: getting the remaining hosts for this loop 34296 1726855349.56125: done getting the remaining hosts for this loop 34296 1726855349.56129: getting the next task for host managed_node1 34296 1726855349.56136: done getting next task for host managed_node1 34296 1726855349.56140: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855349.56145: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.56168: getting variables 34296 1726855349.56170: in VariableManager get_vars() 34296 1726855349.56221: Calling all_inventory to load vars for managed_node1 34296 1726855349.56224: Calling groups_inventory to load vars for managed_node1 34296 1726855349.56226: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.56238: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.56241: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.56244: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.56679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.56929: done with get_vars() 34296 1726855349.56939: done getting variables 34296 1726855349.57001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 14:02:29 -0400 (0:00:00.028) 0:00:05.608 ****** 34296 1726855349.57033: entering _queue_task() for managed_node1/debug 34296 1726855349.57309: worker is 1 (out of 1 available) 34296 1726855349.57323: exiting _queue_task() for managed_node1/debug 34296 1726855349.57334: done queuing things up, now waiting for results queue to drain 34296 1726855349.57335: waiting for pending results... 
34296 1726855349.57707: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34296 1726855349.57731: in run() - task 0affcc66-ac2b-a97a-1acc-000000000111 34296 1726855349.57748: variable 'ansible_search_path' from source: unknown 34296 1726855349.57754: variable 'ansible_search_path' from source: unknown 34296 1726855349.57793: calling self._execute() 34296 1726855349.57880: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.57893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.57907: variable 'omit' from source: magic vars 34296 1726855349.58262: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.58283: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.58455: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.58458: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.58461: when evaluation is False, skipping this task 34296 1726855349.58463: _execute() done 34296 1726855349.58468: dumping result to json 34296 1726855349.58470: done dumping result, returning 34296 1726855349.58472: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcc66-ac2b-a97a-1acc-000000000111] 34296 1726855349.58474: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000111 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34296 1726855349.58608: no more pending results, returning what we have 34296 1726855349.58611: results queue empty 34296 1726855349.58612: checking for any_errors_fatal 34296 1726855349.58620: done checking for any_errors_fatal 34296 1726855349.58621: checking for max_fail_percentage 34296 1726855349.58622: done checking for max_fail_percentage 34296 
1726855349.58623: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.58624: done checking to see if all hosts have failed 34296 1726855349.58624: getting the remaining hosts for this loop 34296 1726855349.58626: done getting the remaining hosts for this loop 34296 1726855349.58629: getting the next task for host managed_node1 34296 1726855349.58637: done getting next task for host managed_node1 34296 1726855349.58640: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855349.58644: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.58664: getting variables 34296 1726855349.58668: in VariableManager get_vars() 34296 1726855349.58717: Calling all_inventory to load vars for managed_node1 34296 1726855349.58720: Calling groups_inventory to load vars for managed_node1 34296 1726855349.58723: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.58734: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.58737: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.58740: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.59162: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000111 34296 1726855349.59168: WORKER PROCESS EXITING 34296 1726855349.59193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.59391: done with get_vars() 34296 1726855349.59401: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 14:02:29 -0400 (0:00:00.024) 0:00:05.632 ****** 34296 1726855349.59496: entering _queue_task() for managed_node1/ping 34296 1726855349.59740: worker is 1 (out of 1 available) 34296 1726855349.59754: exiting _queue_task() for managed_node1/ping 34296 1726855349.59767: done queuing things up, now waiting for results queue to drain 34296 1726855349.59769: waiting for pending results... 
34296 1726855349.60017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34296 1726855349.60144: in run() - task 0affcc66-ac2b-a97a-1acc-000000000112 34296 1726855349.60162: variable 'ansible_search_path' from source: unknown 34296 1726855349.60174: variable 'ansible_search_path' from source: unknown 34296 1726855349.60217: calling self._execute() 34296 1726855349.60299: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.60312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.60329: variable 'omit' from source: magic vars 34296 1726855349.60682: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.60699: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.60810: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.60819: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.60825: when evaluation is False, skipping this task 34296 1726855349.60830: _execute() done 34296 1726855349.60836: dumping result to json 34296 1726855349.60841: done dumping result, returning 34296 1726855349.60851: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcc66-ac2b-a97a-1acc-000000000112] 34296 1726855349.60864: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000112 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.61009: no more pending results, returning what we have 34296 1726855349.61012: results queue empty 34296 1726855349.61013: checking for any_errors_fatal 34296 1726855349.61019: done checking for any_errors_fatal 34296 1726855349.61020: checking for max_fail_percentage 34296 1726855349.61021: done checking for 
max_fail_percentage 34296 1726855349.61022: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.61023: done checking to see if all hosts have failed 34296 1726855349.61024: getting the remaining hosts for this loop 34296 1726855349.61025: done getting the remaining hosts for this loop 34296 1726855349.61029: getting the next task for host managed_node1 34296 1726855349.61039: done getting next task for host managed_node1 34296 1726855349.61041: ^ task is: TASK: meta (role_complete) 34296 1726855349.61045: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.61068: getting variables 34296 1726855349.61070: in VariableManager get_vars() 34296 1726855349.61117: Calling all_inventory to load vars for managed_node1 34296 1726855349.61120: Calling groups_inventory to load vars for managed_node1 34296 1726855349.61122: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.61133: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.61135: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.61138: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.61573: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000112 34296 1726855349.61576: WORKER PROCESS EXITING 34296 1726855349.61599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.61795: done with get_vars() 34296 1726855349.61804: done getting variables 34296 1726855349.61877: done queuing things up, now waiting for results queue to drain 34296 1726855349.61879: results queue empty 34296 1726855349.61880: checking for any_errors_fatal 34296 1726855349.61882: done checking for any_errors_fatal 34296 1726855349.61882: checking for max_fail_percentage 34296 1726855349.61883: done checking for max_fail_percentage 34296 1726855349.61884: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.61885: done checking to see if all hosts have failed 34296 1726855349.61885: getting the remaining hosts for this loop 34296 1726855349.61886: done getting the remaining hosts for this loop 34296 1726855349.61890: getting the next task for host managed_node1 34296 1726855349.61894: done getting next task for host managed_node1 34296 1726855349.61896: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 34296 1726855349.61898: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34296 1726855349.61900: getting variables 34296 1726855349.61901: in VariableManager get_vars() 34296 1726855349.61918: Calling all_inventory to load vars for managed_node1 34296 1726855349.61920: Calling groups_inventory to load vars for managed_node1 34296 1726855349.61922: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.61927: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.61929: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.61931: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.62075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.62269: done with get_vars() 34296 1726855349.62277: done getting variables TASK [Include the task 'cleanup_mock_wifi.yml'] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96 Friday 20 September 2024 14:02:29 -0400 (0:00:00.028) 0:00:05.661 ****** 34296 1726855349.62345: entering _queue_task() for managed_node1/include_tasks 34296 1726855349.63001: worker is 1 (out of 1 available) 34296 1726855349.63011: exiting _queue_task() for managed_node1/include_tasks 34296 1726855349.63022: done queuing things up, now waiting for results queue to drain 34296 1726855349.63023: waiting for pending results... 
34296 1726855349.63085: running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' 34296 1726855349.63472: in run() - task 0affcc66-ac2b-a97a-1acc-000000000142 34296 1726855349.63482: variable 'ansible_search_path' from source: unknown 34296 1726855349.63523: calling self._execute() 34296 1726855349.63723: variable 'ansible_host' from source: host vars for 'managed_node1' 34296 1726855349.63729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34296 1726855349.63741: variable 'omit' from source: magic vars 34296 1726855349.64594: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.64598: Evaluated conditional (ansible_distribution_major_version != '6'): True 34296 1726855349.64802: variable 'ansible_distribution_major_version' from source: facts 34296 1726855349.64817: Evaluated conditional (ansible_distribution_major_version == '7'): False 34296 1726855349.64827: when evaluation is False, skipping this task 34296 1726855349.64834: _execute() done 34296 1726855349.64841: dumping result to json 34296 1726855349.64849: done dumping result, returning 34296 1726855349.64860: done running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' [0affcc66-ac2b-a97a-1acc-000000000142] 34296 1726855349.64933: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000142 34296 1726855349.65127: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000142 34296 1726855349.65130: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34296 1726855349.65205: no more pending results, returning what we have 34296 1726855349.65209: results queue empty 34296 1726855349.65210: checking for any_errors_fatal 34296 1726855349.65213: done checking for any_errors_fatal 34296 1726855349.65213: checking for max_fail_percentage 34296 
1726855349.65215: done checking for max_fail_percentage 34296 1726855349.65216: checking to see if all hosts have failed and the running result is not ok 34296 1726855349.65217: done checking to see if all hosts have failed 34296 1726855349.65218: getting the remaining hosts for this loop 34296 1726855349.65220: done getting the remaining hosts for this loop 34296 1726855349.65225: getting the next task for host managed_node1 34296 1726855349.65231: done getting next task for host managed_node1 34296 1726855349.65234: ^ task is: TASK: Verify network state restored to default 34296 1726855349.65238: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34296 1726855349.65242: getting variables 34296 1726855349.65244: in VariableManager get_vars() 34296 1726855349.65301: Calling all_inventory to load vars for managed_node1 34296 1726855349.65304: Calling groups_inventory to load vars for managed_node1 34296 1726855349.65307: Calling all_plugins_inventory to load vars for managed_node1 34296 1726855349.65320: Calling all_plugins_play to load vars for managed_node1 34296 1726855349.65322: Calling groups_plugins_inventory to load vars for managed_node1 34296 1726855349.65325: Calling groups_plugins_play to load vars for managed_node1 34296 1726855349.66706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34296 1726855349.67064: done with get_vars() 34296 1726855349.67077: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98 Friday 20 September 2024 14:02:29 -0400 (0:00:00.048) 0:00:05.709 ****** 34296 1726855349.67169: entering _queue_task() for managed_node1/include_tasks 34296 1726855349.67886: worker is 1 (out of 1 available) 34296 1726855349.67900: exiting _queue_task() for managed_node1/include_tasks 34296 1726855349.67910: done queuing things up, now waiting for results queue to drain 34296 1726855349.67912: waiting for pending results... 
34296 1726855349.68307: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default
34296 1726855349.68333: in run() - task 0affcc66-ac2b-a97a-1acc-000000000143
34296 1726855349.68468: variable 'ansible_search_path' from source: unknown
34296 1726855349.68572: calling self._execute()
34296 1726855349.68661: variable 'ansible_host' from source: host vars for 'managed_node1'
34296 1726855349.68789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34296 1726855349.68863: variable 'omit' from source: magic vars
34296 1726855349.69491: variable 'ansible_distribution_major_version' from source: facts
34296 1726855349.69504: Evaluated conditional (ansible_distribution_major_version != '6'): True
34296 1726855349.69636: variable 'ansible_distribution_major_version' from source: facts
34296 1726855349.69641: Evaluated conditional (ansible_distribution_major_version == '7'): False
34296 1726855349.69652: when evaluation is False, skipping this task
34296 1726855349.69656: _execute() done
34296 1726855349.69658: dumping result to json
34296 1726855349.69661: done dumping result, returning
34296 1726855349.69670: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affcc66-ac2b-a97a-1acc-000000000143]
34296 1726855349.69673: sending task result for task 0affcc66-ac2b-a97a-1acc-000000000143
34296 1726855349.69934: done sending task result for task 0affcc66-ac2b-a97a-1acc-000000000143
34296 1726855349.69937: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34296 1726855349.69975: no more pending results, returning what we have
34296 1726855349.69979: results queue empty
34296 1726855349.69979: checking for any_errors_fatal
34296 1726855349.69988: done checking for any_errors_fatal
34296 1726855349.69989: checking for max_fail_percentage
34296 1726855349.69990: done checking for max_fail_percentage
34296 1726855349.69991: checking to see if all hosts have failed and the running result is not ok
34296 1726855349.69992: done checking to see if all hosts have failed
34296 1726855349.69992: getting the remaining hosts for this loop
34296 1726855349.69994: done getting the remaining hosts for this loop
34296 1726855349.69997: getting the next task for host managed_node1
34296 1726855349.70004: done getting next task for host managed_node1
34296 1726855349.70005:  ^ task is: TASK: meta (flush_handlers)
34296 1726855349.70008:  ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855349.70011: getting variables
34296 1726855349.70013: in VariableManager get_vars()
34296 1726855349.70056: Calling all_inventory to load vars for managed_node1
34296 1726855349.70059: Calling groups_inventory to load vars for managed_node1
34296 1726855349.70061: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855349.70074: Calling all_plugins_play to load vars for managed_node1
34296 1726855349.70077: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855349.70080: Calling groups_plugins_play to load vars for managed_node1
34296 1726855349.70267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855349.70474: done with get_vars()
34296 1726855349.70485: done getting variables
34296 1726855349.70555: in VariableManager get_vars()
34296 1726855349.70577: Calling all_inventory to load vars for managed_node1
34296 1726855349.70580: Calling groups_inventory to load vars for managed_node1
34296 1726855349.70582: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855349.70589: Calling all_plugins_play to load vars for managed_node1
34296 1726855349.70592: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855349.70595: Calling groups_plugins_play to load vars for managed_node1
34296 1726855349.70742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855349.70981: done with get_vars()
34296 1726855349.70998: done queuing things up, now waiting for results queue to drain
34296 1726855349.71000: results queue empty
34296 1726855349.71001: checking for any_errors_fatal
34296 1726855349.71004: done checking for any_errors_fatal
34296 1726855349.71004: checking for max_fail_percentage
34296 1726855349.71005: done checking for max_fail_percentage
34296 1726855349.71006: checking to see if all hosts have failed and the running result is not ok
34296 1726855349.71007: done checking to see if all hosts have failed
34296 1726855349.71008: getting the remaining hosts for this loop
34296 1726855349.71009: done getting the remaining hosts for this loop
34296 1726855349.71011: getting the next task for host managed_node1
34296 1726855349.71016: done getting next task for host managed_node1
34296 1726855349.71017:  ^ task is: TASK: meta (flush_handlers)
34296 1726855349.71019:  ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855349.71022: getting variables
34296 1726855349.71023: in VariableManager get_vars()
34296 1726855349.71040: Calling all_inventory to load vars for managed_node1
34296 1726855349.71043: Calling groups_inventory to load vars for managed_node1
34296 1726855349.71045: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855349.71050: Calling all_plugins_play to load vars for managed_node1
34296 1726855349.71052: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855349.71055: Calling groups_plugins_play to load vars for managed_node1
34296 1726855349.71197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855349.71375: done with get_vars()
34296 1726855349.71384: done getting variables
34296 1726855349.71432: in VariableManager get_vars()
34296 1726855349.71447: Calling all_inventory to load vars for managed_node1
34296 1726855349.71450: Calling groups_inventory to load vars for managed_node1
34296 1726855349.71452: Calling all_plugins_inventory to load vars for managed_node1
34296 1726855349.71457: Calling all_plugins_play to load vars for managed_node1
34296 1726855349.71459: Calling groups_plugins_inventory to load vars for managed_node1
34296 1726855349.71461: Calling groups_plugins_play to load vars for managed_node1
34296 1726855349.71599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34296 1726855349.71815: done with get_vars()
34296 1726855349.71827: done queuing things up, now waiting for results queue to drain
34296 1726855349.71829: results queue empty
34296 1726855349.71830: checking for any_errors_fatal
34296 1726855349.71831: done checking for any_errors_fatal
34296 1726855349.71832: checking for max_fail_percentage
34296 1726855349.71832: done checking for max_fail_percentage
34296 1726855349.71833: checking to see if all hosts have failed and the running result is not ok
34296 1726855349.71834: done checking to see if all hosts have failed
34296 1726855349.71835: getting the remaining hosts for this loop
34296 1726855349.71836: done getting the remaining hosts for this loop
34296 1726855349.71844: getting the next task for host managed_node1
34296 1726855349.71847: done getting next task for host managed_node1
34296 1726855349.71848:  ^ task is: None
34296 1726855349.71850:  ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34296 1726855349.71851: done queuing things up, now waiting for results queue to drain
34296 1726855349.71851: results queue empty
34296 1726855349.71852: checking for any_errors_fatal
34296 1726855349.71854: done checking for any_errors_fatal
34296 1726855349.71854: checking for max_fail_percentage
34296 1726855349.71855: done checking for max_fail_percentage
34296 1726855349.71856: checking to see if all hosts have failed and the running result is not ok
34296 1726855349.71857: done checking to see if all hosts have failed
34296 1726855349.71858: getting the next task for host managed_node1
34296 1726855349.71861: done getting next task for host managed_node1
34296 1726855349.71861:  ^ task is: None
34296 1726855349.71863:  ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=7    changed=0    unreachable=0    failed=0    skipped=102  rescued=0    ignored=0

Friday 20 September 2024  14:02:29 -0400 (0:00:00.047)       0:00:05.757 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.33s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Gather the minimum subset of ansible_facts required by the network role test --- 0.75s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.51s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Ensure ansible_facts used by role --- 0.06s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.06s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Set network provider to 'nm' -------------------------------------------- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13
fedora.linux_system_roles.network : Install packages -------------------- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider --- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Include the task 'cleanup_mock_wifi.yml' -------------------------------- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96
fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces --- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Verify network state restored to default -------------------------------- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Include the task 'enable_epel.yml' -------------------------------------- 0.05s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Print network provider -------------- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
fedora.linux_system_roles.network : Ensure initscripts network file dependency is present --- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces --- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
fedora.linux_system_roles.network : Ensure ansible_facts used by role --- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
fedora.linux_system_roles.network : Configure networking state ---------- 0.04s
/tmp/collections-ZzD/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
34296 1726855349.71972: RUNNING CLEANUP
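
Editor's note on the skip recorded above: the two "Evaluated conditional" lines and the `false_condition` in the JSON result are what a multi-condition `when:` guard produces; each condition is evaluated in order and the first one that is False short-circuits the task into a skip. A minimal sketch of such a task follows; the module and message are placeholders (the real task body at tests_wireless.yml:98 is not shown in this log), and only the two conditions are taken from the log itself:

```yaml
# Hypothetical sketch -- only the `when:` conditions come from the log above.
- name: Verify network state restored to default
  ansible.builtin.debug:          # placeholder module; real body not in this log
    msg: network state verified
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - ansible_distribution_major_version == '7'  # evaluated False -> task skipped
```

With `ansible_distribution_major_version` gathered as something other than '7' (as on the Fedora host in this run), the second condition fails and the callback reports `skipping: [managed_node1]` with `"false_condition": "ansible_distribution_major_version == '7'"`, exactly as above.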