[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
34350 1726853745.79379: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
34350 1726853745.79855: Added group all to inventory
34350 1726853745.79858: Added group ungrouped to inventory
34350 1726853745.79862: Group all now contains ungrouped
34350 1726853745.79865: Examining possible inventory source: /tmp/network-iHm/inventory.yml
34350 1726853745.93202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
34350 1726853745.93244: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
34350 1726853745.93262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
34350 1726853745.93303: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
34350 1726853745.93351: Loaded config def from plugin (inventory/script)
34350 1726853745.93352: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
34350 1726853745.93397: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
34350 1726853745.93476: Loaded config def from plugin
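The deprecation warning above points at two documented settings: the singular-form collections path and the switch that silences deprecation warnings. Since the run logged "No config file found; using defaults", no ansible.cfg existed here; the fragment below is only a sketch of how those settings would look if one were added.

```ini
# Hypothetical ansible.cfg for this run (none was present; see
# "No config file found; using defaults" in the log above).
[defaults]
# Silence deprecation warnings, as the warning text itself suggests:
deprecation_warnings = False
# Preferred ini-file equivalent of the deprecated ANSIBLE_COLLECTIONS_PATHS
# environment variable, pointing at the collection location from the banner:
collections_path = /tmp/collections-Qi7
```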
(inventory/yaml) 34350 1726853745.93478: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 34350 1726853745.93563: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 34350 1726853745.93987: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 34350 1726853745.93990: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 34350 1726853745.93993: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 34350 1726853745.93999: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 34350 1726853745.94003: Loading data from /tmp/network-iHm/inventory.yml 34350 1726853745.94118: /tmp/network-iHm/inventory.yml was not parsable by auto 34350 1726853745.94194: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 34350 1726853745.94231: Loading data from /tmp/network-iHm/inventory.yml 34350 1726853745.94317: group all already in inventory 34350 1726853745.94323: set inventory_file for managed_node1 34350 1726853745.94327: set inventory_dir for managed_node1 34350 1726853745.94328: Added host managed_node1 to inventory 34350 1726853745.94330: Added host managed_node1 to group all 34350 1726853745.94331: set ansible_host for managed_node1 34350 1726853745.94332: set ansible_ssh_extra_args for managed_node1 34350 1726853745.94335: set inventory_file for managed_node2 34350 1726853745.94337: set inventory_dir for managed_node2 34350 1726853745.94338: Added host managed_node2 to inventory 34350 1726853745.94339: Added host managed_node2 to group all 34350 1726853745.94340: set ansible_host for managed_node2 34350 1726853745.94341: set ansible_ssh_extra_args for managed_node2 34350 
1726853745.94343: set inventory_file for managed_node3 34350 1726853745.94345: set inventory_dir for managed_node3 34350 1726853745.94346: Added host managed_node3 to inventory 34350 1726853745.94347: Added host managed_node3 to group all 34350 1726853745.94348: set ansible_host for managed_node3 34350 1726853745.94349: set ansible_ssh_extra_args for managed_node3 34350 1726853745.94351: Reconcile groups and hosts in inventory. 34350 1726853745.94355: Group ungrouped now contains managed_node1 34350 1726853745.94357: Group ungrouped now contains managed_node2 34350 1726853745.94358: Group ungrouped now contains managed_node3 34350 1726853745.94431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 34350 1726853745.94555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 34350 1726853745.94614: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 34350 1726853745.94642: Loaded config def from plugin (vars/host_group_vars) 34350 1726853745.94644: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 34350 1726853745.94651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 34350 1726853745.94659: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 34350 1726853745.94705: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 34350 1726853745.94956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853745.95029: Loading ModuleDocFragment 'connection_pipelining' from 
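The parse log above shows the `yaml` inventory plugin succeeding after `auto` declined, then setting `ansible_host` and `ansible_ssh_extra_args` for three ungrouped hosts. The actual file contents are not in the log, so the reconstruction below is hypothetical: the shape matches what was parsed, but every address and SSH argument is a placeholder.

```yaml
# Hypothetical reconstruction of /tmp/network-iHm/inventory.yml.
# Host names and the set of variables come from the log; all values
# are placeholders (only 10.31.45.153 appears later, in the SSH trace).
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.45.153                 # example address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node2:
      ansible_host: 198.51.100.2                 # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
    managed_node3:
      ansible_host: 198.51.100.3                 # placeholder
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder
```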
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 34350 1726853745.95053: Loaded config def from plugin (connection/local) 34350 1726853745.95055: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 34350 1726853745.95429: Loaded config def from plugin (connection/paramiko_ssh) 34350 1726853745.95431: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 34350 1726853745.96010: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34350 1726853745.96033: Loaded config def from plugin (connection/psrp) 34350 1726853745.96035: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 34350 1726853745.96492: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34350 1726853745.96515: Loaded config def from plugin (connection/ssh) 34350 1726853745.96517: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 34350 1726853745.98433: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 34350 1726853745.98469: Loaded config def from plugin (connection/winrm) 34350 1726853745.98473: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 34350 1726853745.98502: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 34350 1726853745.98564: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 34350 1726853745.98628: Loaded config def from plugin (shell/cmd) 34350 1726853745.98630: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 34350 1726853745.98654: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 34350 1726853745.98717: Loaded config def from plugin (shell/powershell) 34350 1726853745.98719: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 34350 1726853745.98767: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 34350 1726853745.98933: Loaded config def from plugin (shell/sh) 34350 1726853745.98935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 34350 1726853745.98966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 34350 1726853745.99084: Loaded config def from plugin (become/runas) 34350 1726853745.99086: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 34350 1726853745.99257: Loaded config def from plugin (become/su) 34350 1726853745.99262: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 34350 1726853745.99445: Loaded config def from plugin (become/sudo) 34350 
1726853745.99447: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 34350 1726853745.99480: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml 34350 1726853745.99781: in VariableManager get_vars() 34350 1726853745.99801: done with get_vars() 34350 1726853745.99916: trying /usr/local/lib/python3.12/site-packages/ansible/modules 34350 1726853746.02300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 34350 1726853746.02384: in VariableManager get_vars() 34350 1726853746.02388: done with get_vars() 34350 1726853746.02390: variable 'playbook_dir' from source: magic vars 34350 1726853746.02390: variable 'ansible_playbook_python' from source: magic vars 34350 1726853746.02391: variable 'ansible_config_file' from source: magic vars 34350 1726853746.02391: variable 'groups' from source: magic vars 34350 1726853746.02392: variable 'omit' from source: magic vars 34350 1726853746.02392: variable 'ansible_version' from source: magic vars 34350 1726853746.02392: variable 'ansible_check_mode' from source: magic vars 34350 1726853746.02393: variable 'ansible_diff_mode' from source: magic vars 34350 1726853746.02393: variable 'ansible_forks' from source: magic vars 34350 1726853746.02394: variable 'ansible_inventory_sources' from source: magic vars 34350 1726853746.02394: variable 'ansible_skip_tags' from source: magic vars 34350 1726853746.02395: variable 'ansible_limit' from source: magic vars 34350 1726853746.02395: variable 'ansible_run_tags' from source: magic vars 34350 1726853746.02395: variable 'ansible_verbosity' from source: magic vars 34350 1726853746.02417: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml 34350 1726853746.02767: in VariableManager 
get_vars() 34350 1726853746.02779: done with get_vars() 34350 1726853746.02861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 34350 1726853746.02982: in VariableManager get_vars() 34350 1726853746.02991: done with get_vars() 34350 1726853746.02994: variable 'omit' from source: magic vars 34350 1726853746.03005: variable 'omit' from source: magic vars 34350 1726853746.03025: in VariableManager get_vars() 34350 1726853746.03031: done with get_vars() 34350 1726853746.03063: in VariableManager get_vars() 34350 1726853746.03074: done with get_vars() 34350 1726853746.03098: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34350 1726853746.03225: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34350 1726853746.03303: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34350 1726853746.03668: in VariableManager get_vars() 34350 1726853746.03681: done with get_vars() 34350 1726853746.03951: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 34350 1726853746.04035: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853746.05046: in VariableManager get_vars() 34350 1726853746.05057: done with get_vars() 34350 1726853746.05062: variable 'omit' from source: magic vars 34350 1726853746.05069: variable 'omit' from source: magic vars 34350 1726853746.05090: in VariableManager get_vars() 34350 1726853746.05110: done with get_vars() 34350 1726853746.05123: in VariableManager get_vars() 34350 1726853746.05132: done with get_vars() 34350 1726853746.05151: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34350 1726853746.05213: Loading data from 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34350 1726853746.05256: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34350 1726853746.06595: in VariableManager get_vars() 34350 1726853746.06609: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853746.07896: in VariableManager get_vars() 34350 1726853746.07908: done with get_vars() 34350 1726853746.07911: variable 'omit' from source: magic vars 34350 1726853746.07917: variable 'omit' from source: magic vars 34350 1726853746.07934: in VariableManager get_vars() 34350 1726853746.07943: done with get_vars() 34350 1726853746.07955: in VariableManager get_vars() 34350 1726853746.07967: done with get_vars() 34350 1726853746.07991: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34350 1726853746.08064: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34350 1726853746.08113: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34350 1726853746.08329: in VariableManager get_vars() 34350 1726853746.08343: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853746.09561: in VariableManager get_vars() 34350 1726853746.09578: done with get_vars() 34350 1726853746.09581: variable 'omit' from source: magic vars 34350 1726853746.09595: variable 'omit' from source: magic vars 34350 1726853746.09618: in VariableManager get_vars() 34350 1726853746.09629: done with get_vars() 34350 1726853746.09641: in VariableManager get_vars() 34350 1726853746.09653: done with get_vars() 34350 1726853746.09674: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 34350 
1726853746.09748: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 34350 1726853746.09793: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 34350 1726853746.10009: in VariableManager get_vars() 34350 1726853746.10024: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853746.11209: in VariableManager get_vars() 34350 1726853746.11224: done with get_vars() 34350 1726853746.11249: in VariableManager get_vars() 34350 1726853746.11265: done with get_vars() 34350 1726853746.11305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 34350 1726853746.11318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 34350 1726853746.11496: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 34350 1726853746.11584: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 34350 1726853746.11586: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 34350 1726853746.11606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 34350 1726853746.11621: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 34350 1726853746.11719: Loading 
ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 34350 1726853746.11752: Loaded config def from plugin (callback/default) 34350 1726853746.11754: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34350 1726853746.12552: Loaded config def from plugin (callback/junit) 34350 1726853746.12554: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34350 1726853746.12587: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 34350 1726853746.12624: Loaded config def from plugin (callback/minimal) 34350 1726853746.12625: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34350 1726853746.12652: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 34350 1726853746.12694: Loaded config def from plugin (callback/tree) 
34350 1726853746.12696: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 34350 1726853746.12768: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 34350 1726853746.12770: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
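The callback section above shows `ansible.builtin.debug` and `ansible.builtin.profile_tasks` being redirected to their `ansible.posix` collection counterparts, after which the built-in stdout callbacks (`default`, `minimal`, `oneline`) are skipped. Those redirected callbacks are normally requested via configuration; a sketch of one way to do that (hypothetical for this run, which used no config file; the same effect is available through the ANSIBLE_STDOUT_CALLBACK and ANSIBLE_CALLBACKS_ENABLED environment variables):

```ini
# Hypothetical ansible.cfg enabling the callbacks seen in the log.
[defaults]
stdout_callback = ansible.posix.debug
callbacks_enabled = ansible.posix.profile_tasks
```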
PLAYBOOK: tests_wireless_nm.yml ************************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml
34350 1726853746.12788: in VariableManager get_vars()
34350 1726853746.12796: done with get_vars()
34350 1726853746.12799: in VariableManager get_vars()
34350 1726853746.12804: done with get_vars()
34350 1726853746.12806: variable 'omit' from source: magic vars
34350 1726853746.12826: in VariableManager get_vars()
34350 1726853746.12835: done with get_vars()
34350 1726853746.12846: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_wireless.yml' with nm as provider] *********
34350 1726853746.13209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
34350 1726853746.13256: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
34350 1726853746.13284: getting the remaining hosts for this loop
34350 1726853746.13286: done getting the remaining hosts for this loop
34350 1726853746.13288: getting the next task for host managed_node1
34350 1726853746.13290: done getting next task for host managed_node1
34350 1726853746.13291: ^ task is: TASK: Gathering Facts
34350 1726853746.13292: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853746.13294: getting variables
34350 1726853746.13294: in VariableManager get_vars()
34350 1726853746.13301: Calling all_inventory to load vars for managed_node1
34350 1726853746.13303: Calling groups_inventory to load vars for managed_node1
34350 1726853746.13304: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853746.13314: Calling all_plugins_play to load vars for managed_node1
34350 1726853746.13320: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853746.13322: Calling groups_plugins_play to load vars for managed_node1
34350 1726853746.13343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853746.13388: done with get_vars()
34350 1726853746.13393: done getting variables
34350 1726853746.13451: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Friday 20 September 2024 13:35:46 -0400 (0:00:00.007)       0:00:00.007 ******
34350 1726853746.13467: entering _queue_task() for managed_node1/gather_facts
34350 1726853746.13468: Creating lock for gather_facts
34350 1726853746.13754: worker is 1 (out of 1 available)
34350 1726853746.13768: exiting _queue_task() for managed_node1/gather_facts
34350 1726853746.13781: done queuing things up, now waiting for results queue to drain
34350 1726853746.13783: waiting for pending results...
34350 1726853746.13915: running TaskExecutor() for managed_node1/TASK: Gathering Facts 34350 1726853746.13966: in run() - task 02083763-bbaf-b6c1-0de4-000000000147 34350 1726853746.13980: variable 'ansible_search_path' from source: unknown 34350 1726853746.14012: calling self._execute() 34350 1726853746.14060: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853746.14064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853746.14073: variable 'omit' from source: magic vars 34350 1726853746.14153: variable 'omit' from source: magic vars 34350 1726853746.14177: variable 'omit' from source: magic vars 34350 1726853746.14199: variable 'omit' from source: magic vars 34350 1726853746.14231: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34350 1726853746.14265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34350 1726853746.14281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34350 1726853746.14294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853746.14303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853746.14326: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853746.14329: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853746.14332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853746.14420: Set connection var ansible_timeout to 10 34350 1726853746.14424: Set connection var ansible_module_compression to ZIP_DEFLATED 34350 1726853746.14431: Set connection var ansible_pipelining to False 34350 1726853746.14436: Set connection var ansible_shell_executable 
to /bin/sh 34350 1726853746.14442: Set connection var ansible_connection to ssh 34350 1726853746.14445: Set connection var ansible_shell_type to sh 34350 1726853746.14460: variable 'ansible_shell_executable' from source: unknown 34350 1726853746.14465: variable 'ansible_connection' from source: unknown 34350 1726853746.14468: variable 'ansible_module_compression' from source: unknown 34350 1726853746.14478: variable 'ansible_shell_type' from source: unknown 34350 1726853746.14483: variable 'ansible_shell_executable' from source: unknown 34350 1726853746.14486: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853746.14488: variable 'ansible_pipelining' from source: unknown 34350 1726853746.14490: variable 'ansible_timeout' from source: unknown 34350 1726853746.14492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853746.14630: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34350 1726853746.14639: variable 'omit' from source: magic vars 34350 1726853746.14643: starting attempt loop 34350 1726853746.14646: running the handler 34350 1726853746.14660: variable 'ansible_facts' from source: unknown 34350 1726853746.14675: _low_level_execute_command(): starting 34350 1726853746.14683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34350 1726853746.15245: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853746.15249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.15253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 34350 1726853746.15256: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.15273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853746.15285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853746.15344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.17101: stdout chunk (state=3): >>>/root <<< 34350 1726853746.17222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.17230: stdout chunk (state=3): >>><<< 34350 1726853746.17233: stderr chunk (state=3): >>><<< 34350 1726853746.17352: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853746.17363: _low_level_execute_command(): starting 34350 1726853746.17376: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458 `" && echo ansible-tmp-1726853746.1725833-34369-170982725924458="` echo /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458 `" ) && sleep 0' 34350 1726853746.18259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.18347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.20252: stdout chunk (state=3): >>>ansible-tmp-1726853746.1725833-34369-170982725924458=/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458 <<< 34350 1726853746.20379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.20382: stderr chunk (state=3): >>><<< 34350 1726853746.20385: stdout chunk (state=3): >>><<< 34350 1726853746.20396: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853746.1725833-34369-170982725924458=/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853746.20424: variable 'ansible_module_compression' from source: unknown 34350 1726853746.20464: ANSIBALLZ: Using generic lock for ansible.legacy.setup 34350 1726853746.20474: ANSIBALLZ: Acquiring lock 34350 1726853746.20476: ANSIBALLZ: Lock acquired: 140478520503008 34350 1726853746.20482: ANSIBALLZ: Creating module 34350 1726853746.54880: ANSIBALLZ: Writing module into payload 34350 1726853746.54927: ANSIBALLZ: Writing module 34350 1726853746.54999: ANSIBALLZ: Renaming module 34350 1726853746.55276: ANSIBALLZ: Done creating module 34350 1726853746.55279: variable 'ansible_facts' from source: unknown 34350 1726853746.55281: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853746.55283: _low_level_execute_command(): starting 34350 1726853746.55286: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 34350 1726853746.56470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853746.56487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.56584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.56643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853746.56653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853746.56685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853746.56790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.58451: stdout chunk (state=3): >>>PLATFORM <<< 34350 1726853746.58719: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 34350 1726853746.58722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.58751: stderr chunk (state=3): >>><<< 34350 1726853746.58759: stdout chunk (state=3): >>><<< 34350 1726853746.58784: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853746.58800 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 34350 1726853746.58852: _low_level_execute_command(): starting 34350 1726853746.58862: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 34350 1726853746.59091: Sending initial data 34350 1726853746.59103: Sent initial data (1181 bytes) 34350 1726853746.60164: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853746.60179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.60191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 34350 1726853746.60281: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853746.60319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.63764: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 34350 1726853746.64189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.64204: stdout chunk (state=3): >>><<< 34350 1726853746.64217: stderr chunk (state=3): >>><<< 34350 1726853746.64239: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853746.64586: variable 'ansible_facts' from source: unknown 34350 1726853746.64590: variable 'ansible_facts' from source: unknown 34350 1726853746.64592: variable 'ansible_module_compression' from source: unknown 34350 1726853746.64594: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34350pt_rq5b8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34350 1726853746.64620: variable 'ansible_facts' from source: unknown 34350 1726853746.65039: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py 34350 1726853746.65630: Sending initial data 34350 1726853746.65634: Sent initial data (154 bytes) 34350 1726853746.66588: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.66651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853746.66891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853746.66915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853746.66986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.68621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 34350 1726853746.68645: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34350 1726853746.68675: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 34350 1726853746.68775: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmpjklosk6b /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py <<< 34350 1726853746.68785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py" <<< 34350 1726853746.68809: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmpjklosk6b" to remote "/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py" <<< 34350 1726853746.71413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.71465: stderr chunk (state=3): >>><<< 34350 1726853746.71516: stdout chunk (state=3): >>><<< 34350 1726853746.71542: done transferring module to remote 34350 1726853746.71679: _low_level_execute_command(): starting 34350 1726853746.71683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/ /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py && sleep 0' 34350 1726853746.72785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853746.72886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853746.72898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.72909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853746.73046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.75004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853746.75033: stderr chunk (state=3): >>><<< 34350 1726853746.75043: stdout chunk (state=3): >>><<< 34350 1726853746.75223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853746.75226: _low_level_execute_command(): starting 34350 1726853746.75229: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/AnsiballZ_setup.py && sleep 0' 34350 1726853746.76331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853746.76451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853746.76557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853746.76580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853746.76595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853746.76733: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853746.78893: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34350 1726853746.78956: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 34350 1726853746.79043: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34350 1726853746.79063: stdout chunk (state=3): >>>import 'posix' # <<< 34350 1726853746.79093: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34350 1726853746.79185: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 34350 1726853746.79188: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.79292: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724e84d0> <<< 34350 1726853746.79395: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724eaa50> import '_signal' # import '_abc' # import 'abc' # <<< 34350 1726853746.79418: stdout chunk (state=3): >>>import 'io' # <<< 34350 1726853746.79718: stdout chunk 
(state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # <<< 34350 1726853746.79722: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872299130> <<< 34350 1726853746.79762: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34350 1726853746.79929: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872299fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 34350 1726853746.80212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34350 1726853746.80490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d7e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d7ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34350 1726853746.80607: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34350 1726853746.80610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 34350 1726853746.80615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f787230f830> <<< 34350 1726853746.80724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787230fec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722efad0> import '_functools' # <<< 34350 1726853746.81094: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722ed1f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d4fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722ee0c0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232cbf0> <<< 34350 1726853746.81117: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34350 1726853746.81120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723647d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d4230> <<< 34350 1726853746.81195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872364c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872364b30> <<< 34350 1726853746.81393: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872364f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d2d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f7872365610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723652e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872366510> <<< 34350 1726853746.81594: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237c6e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237ddf0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 34350 1726853746.81598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34350 1726853746.81626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 34350 1726853746.81642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237ec90> <<< 34350 1726853746.81687: stdout chunk (state=3): 
>>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853746.81855: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237f2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237e1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237fd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237f4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723664b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 34350 1726853746.81959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787207fc50> <<< 34350 1726853746.81980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches 
/usr/lib64/python3.12/bisect.py <<< 34350 1726853746.81992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34350 1726853746.82180: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720a8410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34350 1726853746.82242: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853746.82287: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a9010> <<< 34350 1726853746.82486: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a99d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78720a88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787207de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34350 1726853746.82614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720aadb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720a9af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872366c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34350 1726853746.82831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720d7110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34350 1726853746.82905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720f7440> # 
/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34350 1726853746.82949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34350 1726853746.83086: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872158260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34350 1726853746.83155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34350 1726853746.83376: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787215a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872158380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872125250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871f613d0> <<< 34350 1726853746.83485: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720f6270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720abce0> <<< 34350 1726853746.83787: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f78720f6360> <<< 34350 1726853746.83866: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_0vptzod5/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 34350 1726853746.84114: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34350 1726853746.84222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc70b0> import '_typing' # <<< 34350 1726853746.84358: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fa5fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fa5130> <<< 34350 1726853746.84486: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 34350 1726853746.84683: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 34350 1726853746.86187: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.87028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc4f80> <<< 34350 1726853746.87063: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.87091: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 34350 1726853746.87103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34350 1726853746.87123: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34350 1726853746.87153: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853746.87167: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff6a20> <<< 34350 1726853746.87197: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff67b0> <<< 34350 1726853746.87226: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff60f0> <<< 34350 1726853746.87244: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34350 1726853746.87393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' 
import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc7d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 34350 1726853746.87442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34350 1726853746.87455: stdout chunk (state=3): >>>import '_locale' # <<< 34350 1726853746.87497: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff7f50> <<< 34350 1726853746.87514: stdout chunk (state=3): >>>import 'pwd' # <<< 34350 1726853746.87525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34350 1726853746.87724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34350 1726853746.87729: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871929c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787192b830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192c230> <<< 34350 1726853746.87736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34350 1726853746.87746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34350 1726853746.87762: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192d100> <<< 34350 1726853746.87782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 34350 1726853746.87996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78721581d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches 
/usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 34350 1726853746.88025: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34350 1726853746.88049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34350 1726853746.88165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34350 1726853746.88278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34350 1726853746.88281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871937bf0> <<< 34350 1726853746.88283: stdout chunk (state=3): >>>import '_tokenize' # <<< 34350 1726853746.88286: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719366c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871936420> <<< 34350 1726853746.88494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871936990> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 
'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197be60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34350 1726853746.88510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34350 1726853746.88536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34350 1726853746.88575: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853746.88592: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197da60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197d820> <<< 34350 1726853746.88601: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34350 1726853746.88631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 34350 1726853746.88683: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853746.88701: stdout chunk 
(state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197e150> <<< 34350 1726853746.88711: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34350 1726853746.88751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.88776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34350 1726853746.88793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34350 1726853746.88838: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871983830> <<< 34350 1726853746.88963: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871980200> <<< 34350 1726853746.89057: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78719845c0> <<< 34350 1726853746.89061: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 
'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871984830> <<< 34350 1726853746.89591: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871984ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718101d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871986960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871987d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719865a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.89668: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.89688: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34350 1726853746.89712: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.89727: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34350 1726853746.89852: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.89970: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.90542: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.91197: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871815820> <<< 34350 1726853746.91241: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34350 1726853746.91263: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871816570> <<< 34350 1726853746.91313: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc7020> <<< 34350 1726853746.91390: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 34350 1726853746.91537: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.92149: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718165d0> # zipimport: zlib available <<< 34350 1726853746.92162: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.92675: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.92679: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93031: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.93035: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34350 1726853746.93038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93041: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93043: stdout chunk (state=3): >>>import 
'ansible.module_utils.parsing' # <<< 34350 1726853746.93045: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93078: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93113: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34350 1726853746.93127: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93377: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34350 1726853746.93632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 34350 1726853746.93646: stdout chunk (state=3): >>>import '_ast' # <<< 34350 1726853746.93713: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871817860> <<< 34350 1726853746.93735: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93822: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.93873: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34350 1726853746.93881: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34350 1726853746.93982: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 34350 1726853746.93993: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94075: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94144: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94191: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 34350 1726853746.94253: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.94461: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718220f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787181fe30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34350 1726853746.94469: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94473: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94514: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94541: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94592: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.94786: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34350 1726853746.94790: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code 
object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34350 1726853746.94792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34350 1726853746.94795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34350 1726853746.94797: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787190a8a0> <<< 34350 1726853746.94924: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719fe5a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871815370> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192fd40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34350 1726853746.94956: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.94981: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 34350 1726853746.94994: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 34350 1726853746.95221: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34350 1726853746.95226: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.95229: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 34350 1726853746.95235: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95237: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95239: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95241: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95281: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95321: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95356: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34350 1726853746.95402: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95656: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95662: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95665: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95667: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 34350 1726853746.95669: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95880: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95947: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.95987: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.96041: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853746.96087: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 34350 1726853746.96150: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b6270> <<< 34350 1726853746.96478: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714a7f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bc530> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718a6fc0> <<< 34350 1726853746.96481: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b6de0> <<< 34350 1726853746.96484: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4950> <<< 34350 1726853746.96486: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4500> <<< 34350 1726853746.96489: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34350 1726853746.96768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bf170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bea50> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bec30> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bde80> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34350 1726853746.96797: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bf2f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34350 1726853746.96828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 34350 1726853746.96858: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871511d90> <<< 34350 1726853746.96885: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bfd70> <<< 34350 1726853746.97195: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # <<< 34350 1726853746.97218: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34350 1726853746.97453: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.97456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 34350 1726853746.97459: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.97489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 34350 1726853746.97590: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.97613: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.97670: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 34350 1726853746.97720: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34350 1726853746.97738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 34350 1726853746.98210: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.98692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.98745: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.98862: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.98865: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 34350 1726853746.98993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.99004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 34350 1726853746.99175: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.99179: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853746.99276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 34350 1726853746.99283: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.99285: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.99372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 34350 1726853746.99390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 34350 
1726853746.99405: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871513e30> <<< 34350 1726853746.99677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 34350 1726853746.99681: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871512960> import 'ansible.module_utils.facts.system.local' # <<< 34350 1726853746.99683: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.99685: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.99705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 34350 1726853746.99719: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853746.99984: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.00027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 34350 1726853747.00041: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00229: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00233: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34350 1726853747.00236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34350 1726853747.00238: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.00292: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.00306: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787154a120> <<< 34350 1726853747.00482: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871539f10> import 'ansible.module_utils.facts.system.python' # <<< 34350 1726853747.00492: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00544: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34350 1726853747.00610: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.00949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.01020: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 34350 1726853747.01036: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.01179: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.01195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.01404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787155dc70> import 'getpass' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f787155dd00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 34350 1726853747.01545: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.01910: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.02090: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.02174: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.02492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.02557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34350 1726853747.02574: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.02617: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.02690: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.03475: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.03776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 34350 1726853747.03779: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.03812: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.04095: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.04292: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.04423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34350 1726853747.04449: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 34350 1726853747.04467: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.04623: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.04691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.04747: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.04949: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05144: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 34350 1726853747.05160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 34350 1726853747.05190: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 34350 1726853747.05379: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05382: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 34350 1726853747.05387: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05388: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05424: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34350 1726853747.05720: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 34350 1726853747.05724: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05726: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 34350 1726853747.05731: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.05986: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34350 1726853747.06254: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06495: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.06511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 34350 1726853747.06525: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34350 1726853747.06810: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06813: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 34350 1726853747.06817: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available <<< 34350 1726853747.06820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 34350 1726853747.06821: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.06828: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.07150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 34350 1726853747.07172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 34350 1726853747.07222: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34350 1726853747.07284: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07575: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34350 1726853747.07668: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 34350 1726853747.07766: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.07804: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.08005: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 34350 1726853747.08009: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 
1726853747.08019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 34350 1726853747.08034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 34350 1726853747.08115: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.08291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 34350 1726853747.08468: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34350 1726853747.08492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34350 1726853747.08506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34350 1726853747.08536: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.08552: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78712fa840> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78712fac60> <<< 34350 1726853747.08876: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78712fbe90> <<< 34350 1726853747.21583: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 34350 
1726853747.21607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 34350 1726853747.21634: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871340b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 34350 1726853747.21882: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871340e90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871342270> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871341d30> <<< 34350 1726853747.22045: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 34350 1726853747.46216: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.57666015625, "5m": 0.4482421875, "15m": 0.259765625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 913, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771280384, "block_size": 4096, "block_total": 65519099, "block_available": 63909004, "block_used": 1610095, "inode_total": 131070960, "inode_available": 131028926, "inode_used": 42034, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 
22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": 
false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "47", "epoch": "1726853747", "epoch_int": "1726853747", "date": "2024-09-20", "time": "13:35:47", "iso8601_micro": "2024-09-20T17:35:47.457241Z", "iso8601": "2024-09-20T17:35:47Z", "iso8601_basic": "20240920T133547457241", "iso8601_basic_short": "20240920T133547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34350 1726853747.46685: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 34350 1726853747.46702: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 34350 1726853747.46716: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing 
_functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings<<< 34350 1726853747.46749: stdout chunk (state=3): >>> # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp <<< 34350 1726853747.46763: stdout chunk (state=3): >>># cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex <<< 34350 1726853747.46791: stdout chunk (state=3): >>># cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 34350 1726853747.46986: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux <<< 34350 1726853747.47017: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] 
removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 34350 1726853747.47277: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34350 1726853747.47280: stdout chunk (state=3): 
>>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34350 1726853747.47332: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34350 1726853747.47593: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 34350 1726853747.47625: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 34350 1726853747.47647: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 34350 1726853747.47686: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing <<< 34350 1726853747.47709: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.timeout # destroy 
ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 34350 1726853747.47782: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser <<< 34350 1726853747.47796: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34350 1726853747.48190: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy 
_collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34350 1726853747.48210: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34350 1726853747.48310: stdout chunk (state=3): >>># 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 34350 1726853747.48352: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random <<< 34350 1726853747.48398: stdout chunk (state=3): >>># destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 34350 1726853747.48412: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 34350 1726853747.48746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 34350 1726853747.49224: stderr chunk (state=3): >>><<< 34350 1726853747.49228: stdout chunk (state=3): >>><<< 34350 1726853747.49699: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724e84d0> import 'encodings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78724b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78724eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872299130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872299fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d7e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d7ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787230f830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f787230fec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722efad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722ed1f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d4fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232f740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232e360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722ee0c0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787232cbf0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723647d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d4230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872364c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872364b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7872364f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78722d2d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872365610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723652e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872366510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237c6e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237ddf0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237ec90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237f2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237e1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f787237fd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787237f4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78723664b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787207fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720a8410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a9010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78720a99d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720a88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787207de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720aadb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720a9af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872366c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78720d7110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720f7440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872158260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787215a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872158380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7872125250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7871f613d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720f6270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78720abce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f78720f6360> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_0vptzod5/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc70b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fa5fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fa5130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc4f80> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff6a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff67b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff60f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc7d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871ff7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871ff7f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871929c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787192b830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f787192fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78721581d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871937bf0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719366c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871936420> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871936990> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192e5d0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197be60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197da60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197d820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787197ffe0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f787197e150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871983830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871980200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78719845c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871984830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871984ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787197c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718101d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871986960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871987d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719865a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871815820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871816570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871fc7020> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718165d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871817860> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78718220f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787181fe30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787190a8a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78719fe5a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871815370> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787192fd40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b6270> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714a7f20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bc530> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718a6fc0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f78718b6de0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4950> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4500> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bf170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bea50> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78714bec30> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bde80> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bf2f0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7871511d90> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78714bfd70> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78718b4620> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871513e30> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871512960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787154a120> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871539f10> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f787155dc70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f787155dd00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78712fa840> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78712fac60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78712fbe90> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871340b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871340e90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871342270> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7871341d30> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.57666015625, "5m": 0.4482421875, "15m": 0.259765625}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", 
"Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 913, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771280384, "block_size": 4096, "block_total": 65519099, "block_available": 63909004, "block_used": 1610095, "inode_total": 131070960, "inode_available": 131028926, "inode_used": 42034, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": 
"on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "47", "epoch": "1726853747", "epoch_int": "1726853747", "date": "2024-09-20", "time": "13:35:47", "iso8601_micro": "2024-09-20T17:35:47.457241Z", "iso8601": "2024-09-20T17:35:47Z", "iso8601_basic": "20240920T133547457241", "iso8601_basic_short": "20240920T133547", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # 
cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing 
lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # 
cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] 
removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] 
removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy 
datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping 
copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear 
sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
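The interpreter-discovery warning above can be silenced by pinning the interpreter explicitly instead of relying on discovery. A minimal sketch, assuming the interpreter path reported in the warning; the exact inventory layout here is illustrative, not taken from this run's inventory file:

```yaml
# Hypothetical inventory fragment: pin ansible_python_interpreter so a
# later installation of another Python cannot change what /usr/bin/python3
# style discovery resolves to on the managed host.
all:
  hosts:
    managed_node1:
      ansible_python_interpreter: /usr/bin/python3.12
```

The same variable can also be set per group or in `ansible.cfg` via `interpreter_python`.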
34350 1726853747.51514: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34350 1726853747.51518: _low_level_execute_command(): starting 34350 1726853747.51520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853746.1725833-34369-170982725924458/ > /dev/null 2>&1 && sleep 0' 34350 1726853747.52459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853747.52463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853747.52465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853747.52467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34350 1726853747.52468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 34350 1726853747.52485: stderr chunk (state=3): >>>debug2: match not found <<< 34350 1726853747.52581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853747.52798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853747.52873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853747.54745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853747.54757: stdout chunk (state=3): >>><<< 34350 1726853747.54770: stderr chunk (state=3): >>><<< 34350 1726853747.54793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
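The `_low_level_execute_command()` round-trips in this trace all follow the same shape: wrap the command in `/bin/sh -c '... && sleep 0'`, run it over the connection plugin, and report `rc`, `stdout`, and `stderr`. A rough local analogue (this runs the shell locally via `subprocess`; Ansible itself executes it over the multiplexed SSH connection shown in the debug chunks, so this is an assumption-based illustration, not Ansible's implementation):

```python
import subprocess

def low_level_execute(cmd: str) -> tuple[int, str, str]:
    # Mirror the trace's pattern: hand the whole command string to
    # /bin/sh -c and capture return code, stdout, and stderr.
    proc = subprocess.run(["/bin/sh", "-c", cmd],
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Same probe the connection plugin issues later in this trace
# to resolve the remote home directory.
rc, out, err = low_level_execute("echo ~ && sleep 0")
```

The trailing `&& sleep 0` is the marker Ansible appends to its shell invocations; it has no effect on the result.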
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853747.55076: handler run complete 34350 1726853747.55079: variable 'ansible_facts' from source: unknown 34350 1726853747.55298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.55783: variable 'ansible_facts' from source: unknown 34350 1726853747.55951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.56290: attempt loop complete, returning result 34350 1726853747.56299: _execute() done 34350 1726853747.56306: dumping result to json 34350 1726853747.56337: done dumping result, returning 34350 1726853747.56349: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-b6c1-0de4-000000000147] 34350 1726853747.56357: sending task result for task 02083763-bbaf-b6c1-0de4-000000000147 34350 1726853747.57112: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000147 34350 1726853747.57116: WORKER PROCESS EXITING ok: [managed_node1] 34350 1726853747.57719: no more pending results, returning what we have 34350 1726853747.57722: results queue empty 34350 1726853747.57723: checking for any_errors_fatal 34350 1726853747.57725: done checking for any_errors_fatal 34350 1726853747.57725: checking for max_fail_percentage 34350 1726853747.57727: done checking for max_fail_percentage 34350 1726853747.57728: checking to see if all hosts have failed and the running result is not ok 34350 1726853747.57728: done checking to see if all hosts have failed 34350 1726853747.57729: getting the remaining hosts for this loop 34350 1726853747.57731: done getting the remaining hosts for this loop 34350 1726853747.57734: getting the next task for host managed_node1 34350 1726853747.57741: done getting next task for host managed_node1 34350 1726853747.57743: ^ task is: TASK: meta (flush_handlers) 34350 
1726853747.57744: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853747.57749: getting variables 34350 1726853747.57750: in VariableManager get_vars() 34350 1726853747.57887: Calling all_inventory to load vars for managed_node1 34350 1726853747.57891: Calling groups_inventory to load vars for managed_node1 34350 1726853747.57894: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853747.57904: Calling all_plugins_play to load vars for managed_node1 34350 1726853747.57907: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853747.57909: Calling groups_plugins_play to load vars for managed_node1 34350 1726853747.58267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.58569: done with get_vars() 34350 1726853747.58697: done getting variables 34350 1726853747.58764: in VariableManager get_vars() 34350 1726853747.58775: Calling all_inventory to load vars for managed_node1 34350 1726853747.58778: Calling groups_inventory to load vars for managed_node1 34350 1726853747.58780: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853747.58784: Calling all_plugins_play to load vars for managed_node1 34350 1726853747.58786: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853747.58907: Calling groups_plugins_play to load vars for managed_node1 34350 1726853747.59186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.59611: done with get_vars() 34350 1726853747.59625: done queuing things up, now waiting for results queue to drain 34350 1726853747.59628: results queue empty 34350 
1726853747.59628: checking for any_errors_fatal 34350 1726853747.59631: done checking for any_errors_fatal 34350 1726853747.59636: checking for max_fail_percentage 34350 1726853747.59637: done checking for max_fail_percentage 34350 1726853747.59638: checking to see if all hosts have failed and the running result is not ok 34350 1726853747.59639: done checking to see if all hosts have failed 34350 1726853747.59640: getting the remaining hosts for this loop 34350 1726853747.59641: done getting the remaining hosts for this loop 34350 1726853747.59643: getting the next task for host managed_node1 34350 1726853747.59648: done getting next task for host managed_node1 34350 1726853747.59651: ^ task is: TASK: Include the task 'el_repo_setup.yml' 34350 1726853747.59652: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853747.59654: getting variables 34350 1726853747.59655: in VariableManager get_vars() 34350 1726853747.59779: Calling all_inventory to load vars for managed_node1 34350 1726853747.59782: Calling groups_inventory to load vars for managed_node1 34350 1726853747.59784: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853747.59787: Calling all_plugins_play to load vars for managed_node1 34350 1726853747.59789: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853747.59791: Calling groups_plugins_play to load vars for managed_node1 34350 1726853747.60145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.60484: done with get_vars() 34350 1726853747.60493: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:11 Friday 20 September 2024 13:35:47 -0400 (0:00:01.471) 0:00:01.478 ****** 34350 1726853747.60581: entering _queue_task() for managed_node1/include_tasks 34350 1726853747.60583: Creating lock for include_tasks 34350 1726853747.60929: worker is 1 (out of 1 available) 34350 1726853747.60942: exiting _queue_task() for managed_node1/include_tasks 34350 1726853747.60956: done queuing things up, now waiting for results queue to drain 34350 1726853747.60957: waiting for pending results... 
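The task banner above points at line 11 of `tests_wireless_nm.yml`. The trace does not show the playbook source, but the include it describes would look something like this hedged reconstruction (task name and included file are taken from the log; the exact YAML is an assumption):

```yaml
# Hypothetical reconstruction of the including task in
# tests_wireless_nm.yml; the relative path follows from the
# included file living under tests/network/tasks/.
- name: Include the task 'el_repo_setup.yml'
  ansible.builtin.include_tasks: tasks/el_repo_setup.yml
```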
34350 1726853747.61234: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 34350 1726853747.61343: in run() - task 02083763-bbaf-b6c1-0de4-000000000006 34350 1726853747.61363: variable 'ansible_search_path' from source: unknown 34350 1726853747.61411: calling self._execute() 34350 1726853747.61484: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853747.61495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853747.61517: variable 'omit' from source: magic vars 34350 1726853747.61625: _execute() done 34350 1726853747.61634: dumping result to json 34350 1726853747.61643: done dumping result, returning 34350 1726853747.61656: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-b6c1-0de4-000000000006] 34350 1726853747.61666: sending task result for task 02083763-bbaf-b6c1-0de4-000000000006 34350 1726853747.61877: no more pending results, returning what we have 34350 1726853747.61883: in VariableManager get_vars() 34350 1726853747.61917: Calling all_inventory to load vars for managed_node1 34350 1726853747.61920: Calling groups_inventory to load vars for managed_node1 34350 1726853747.61922: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853747.61942: Calling all_plugins_play to load vars for managed_node1 34350 1726853747.61946: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853747.61949: Calling groups_plugins_play to load vars for managed_node1 34350 1726853747.62374: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000006 34350 1726853747.62377: WORKER PROCESS EXITING 34350 1726853747.62384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.62565: done with get_vars() 34350 1726853747.62577: variable 'ansible_search_path' from source: unknown 34350 1726853747.62592: we have 
included files to process 34350 1726853747.62593: generating all_blocks data 34350 1726853747.62594: done generating all_blocks data 34350 1726853747.62595: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34350 1726853747.62596: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34350 1726853747.62599: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 34350 1726853747.63248: in VariableManager get_vars() 34350 1726853747.63262: done with get_vars() 34350 1726853747.63277: done processing included file 34350 1726853747.63280: iterating over new_blocks loaded from include file 34350 1726853747.63281: in VariableManager get_vars() 34350 1726853747.63291: done with get_vars() 34350 1726853747.63292: filtering new block on tags 34350 1726853747.63306: done filtering new block on tags 34350 1726853747.63309: in VariableManager get_vars() 34350 1726853747.63319: done with get_vars() 34350 1726853747.63320: filtering new block on tags 34350 1726853747.63340: done filtering new block on tags 34350 1726853747.63343: in VariableManager get_vars() 34350 1726853747.63353: done with get_vars() 34350 1726853747.63355: filtering new block on tags 34350 1726853747.63368: done filtering new block on tags 34350 1726853747.63369: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 34350 1726853747.63376: extending task lists for all hosts with included blocks 34350 1726853747.63422: done extending task lists 34350 1726853747.63423: done processing included files 34350 1726853747.63424: results queue empty 34350 1726853747.63425: checking for any_errors_fatal 34350 1726853747.63426: done checking for any_errors_fatal 34350 
1726853747.63427: checking for max_fail_percentage 34350 1726853747.63428: done checking for max_fail_percentage 34350 1726853747.63428: checking to see if all hosts have failed and the running result is not ok 34350 1726853747.63429: done checking to see if all hosts have failed 34350 1726853747.63430: getting the remaining hosts for this loop 34350 1726853747.63431: done getting the remaining hosts for this loop 34350 1726853747.63433: getting the next task for host managed_node1 34350 1726853747.63438: done getting next task for host managed_node1 34350 1726853747.63440: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 34350 1726853747.63447: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853747.63449: getting variables 34350 1726853747.63450: in VariableManager get_vars() 34350 1726853747.63458: Calling all_inventory to load vars for managed_node1 34350 1726853747.63460: Calling groups_inventory to load vars for managed_node1 34350 1726853747.63462: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853747.63467: Calling all_plugins_play to load vars for managed_node1 34350 1726853747.63469: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853747.63474: Calling groups_plugins_play to load vars for managed_node1 34350 1726853747.63629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853747.63820: done with get_vars() 34350 1726853747.63829: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:35:47 -0400 (0:00:00.033) 0:00:01.511 ****** 34350 1726853747.63898: entering _queue_task() for managed_node1/setup 34350 1726853747.64230: worker is 1 (out of 1 available) 34350 1726853747.64242: exiting _queue_task() for managed_node1/setup 34350 1726853747.64254: done queuing things up, now waiting for results queue to drain 34350 1726853747.64255: waiting for pending results... 
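The fact-gathering task whose banner appears above is evaluated a few records later with the conditional `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`. A hedged sketch of what such a task could look like (the task name and conditional are taken from the trace; the `gather_subset` value is an assumption, since the actual module arguments are not shown):

```yaml
# Hypothetical reconstruction of the minimal fact-gathering task in
# el_repo_setup.yml: only run setup when the facts the network role
# test needs are not already present.
- name: Gather the minimum subset of ansible_facts required by the network role test
  ansible.builtin.setup:
    gather_subset: min
  when: >-
    not ansible_facts.keys() | list |
    intersect(network_test_required_facts) == network_test_required_facts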
34350 1726853747.64487: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 34350 1726853747.64602: in run() - task 02083763-bbaf-b6c1-0de4-000000000158 34350 1726853747.64618: variable 'ansible_search_path' from source: unknown 34350 1726853747.64624: variable 'ansible_search_path' from source: unknown 34350 1726853747.64669: calling self._execute() 34350 1726853747.64742: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853747.64764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853747.64781: variable 'omit' from source: magic vars 34350 1726853747.65321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34350 1726853747.67415: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34350 1726853747.67491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34350 1726853747.67532: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34350 1726853747.67573: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34350 1726853747.67680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34350 1726853747.67687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34350 1726853747.67722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34350 1726853747.67751: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34350 1726853747.67799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34350 1726853747.67817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34350 1726853747.67994: variable 'ansible_facts' from source: unknown 34350 1726853747.68065: variable 'network_test_required_facts' from source: task vars 34350 1726853747.68115: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 34350 1726853747.68126: variable 'omit' from source: magic vars 34350 1726853747.68165: variable 'omit' from source: magic vars 34350 1726853747.68202: variable 'omit' from source: magic vars 34350 1726853747.68236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34350 1726853747.68265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34350 1726853747.68291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34350 1726853747.68312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853747.68333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853747.68438: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853747.68441: variable 'ansible_host' from source: host vars for 
'managed_node1' 34350 1726853747.68443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853747.68478: Set connection var ansible_timeout to 10 34350 1726853747.68490: Set connection var ansible_module_compression to ZIP_DEFLATED 34350 1726853747.68502: Set connection var ansible_pipelining to False 34350 1726853747.68512: Set connection var ansible_shell_executable to /bin/sh 34350 1726853747.68523: Set connection var ansible_connection to ssh 34350 1726853747.68528: Set connection var ansible_shell_type to sh 34350 1726853747.68562: variable 'ansible_shell_executable' from source: unknown 34350 1726853747.68570: variable 'ansible_connection' from source: unknown 34350 1726853747.68579: variable 'ansible_module_compression' from source: unknown 34350 1726853747.68655: variable 'ansible_shell_type' from source: unknown 34350 1726853747.68660: variable 'ansible_shell_executable' from source: unknown 34350 1726853747.68662: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853747.68664: variable 'ansible_pipelining' from source: unknown 34350 1726853747.68667: variable 'ansible_timeout' from source: unknown 34350 1726853747.68669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853747.68752: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34350 1726853747.68773: variable 'omit' from source: magic vars 34350 1726853747.68788: starting attempt loop 34350 1726853747.68877: running the handler 34350 1726853747.68880: _low_level_execute_command(): starting 34350 1726853747.68882: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34350 1726853747.69514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 
1726853747.69557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853747.69575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34350 1726853747.69663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853747.69687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853747.69712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853747.70001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853747.71680: stdout chunk (state=3): >>>/root <<< 34350 1726853747.71978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853747.71981: stdout chunk (state=3): >>><<< 34350 1726853747.71983: stderr chunk (state=3): >>><<< 34350 1726853747.71987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853747.71996: _low_level_execute_command(): starting 34350 1726853747.71999: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076 `" && echo ansible-tmp-1726853747.7185016-34421-123272660633076="` echo /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076 `" ) && sleep 0' 34350 1726853747.73330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853747.73334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34350 1726853747.73349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 34350 1726853747.73359: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853747.73369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853747.73381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853747.73395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853747.73586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853747.76153: stdout chunk (state=3): >>>ansible-tmp-1726853747.7185016-34421-123272660633076=/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076 <<< 34350 1726853747.76326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853747.76377: stderr chunk (state=3): >>><<< 34350 1726853747.76380: stdout chunk (state=3): >>><<< 34350 1726853747.76576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853747.7185016-34421-123272660633076=/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853747.76579: variable 'ansible_module_compression' from source: unknown 34350 1726853747.76581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-34350pt_rq5b8/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 34350 1726853747.76583: variable 'ansible_facts' from source: unknown 34350 1726853747.76769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py 34350 1726853747.77322: Sending initial data 34350 1726853747.77325: Sent initial data (154 bytes) 34350 1726853747.78249: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853747.78264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853747.78267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853747.78294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34350 1726853747.78369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 34350 1726853747.78375: stderr chunk (state=3): >>>debug2: match not found <<< 34350 1726853747.78383: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853747.78387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 34350 1726853747.78414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853747.78427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853747.78506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34350 1726853747.80697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34350 1726853747.80738: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34350 1726853747.80784: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmpe86kcuvw /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py <<< 34350 1726853747.80788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py" <<< 34350 1726853747.80827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmpe86kcuvw" to remote "/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py" <<< 34350 1726853747.80830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py" <<< 34350 1726853747.82139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853747.82313: stderr chunk (state=3): >>><<< 34350 1726853747.82322: stdout chunk (state=3): >>><<< 34350 1726853747.82324: done transferring module to remote 34350 1726853747.82330: _low_level_execute_command(): starting 34350 1726853747.82333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/ /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py && sleep 0' 34350 1726853747.83331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853747.83440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853747.83496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853747.83662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853747.83766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853747.83833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853747.85687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853747.85712: stderr chunk (state=3): >>><<< 34350 1726853747.85721: stdout chunk (state=3): >>><<< 34350 1726853747.85744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853747.85747: _low_level_execute_command(): starting 34350 1726853747.85761: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/AnsiballZ_setup.py && sleep 0' 34350 1726853747.86376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853747.86381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853747.86383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853747.86386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34350 1726853747.86388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 34350 1726853747.86462: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853747.86466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853747.86514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853747.88715: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34350 1726853747.88734: stdout chunk (state=3): >>>import _imp # builtin <<< 34350 1726853747.88795: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 34350 1726853747.88866: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 34350 1726853747.88879: stdout chunk (state=3): >>>import 'posix' # <<< 34350 1726853747.88921: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 34350 1726853747.88948: stdout chunk (state=3): >>>import 'time' # <<< 34350 1726853747.88969: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 34350 1726853747.89003: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.89029: stdout chunk (state=3): >>>import '_codecs' # <<< 34350 1726853747.89051: stdout chunk (state=3): >>>import 'codecs' # <<< 34350 1726853747.89091: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 34350 1726853747.89114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 34350 1726853747.89139: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a5684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a537b30> <<< 34350 1726853747.89164: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 34350 1726853747.89209: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a56aa50> <<< 34350 1726853747.89232: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 34350 1726853747.89245: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 34350 1726853747.89282: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 34350 1726853747.89369: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34350 1726853747.89399: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 34350 1726853747.89437: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 34350 1726853747.89466: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 34350 1726853747.89490: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 34350 1726853747.89529: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34350 1726853747.89554: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a31d130> <<< 34350 1726853747.89612: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34350 1726853747.89631: stdout chunk (state=3): >>># code object 
from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a31dfa0> <<< 34350 1726853747.89662: stdout chunk (state=3): >>>import 'site' # <<< 34350 1726853747.89685: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 34350 1726853747.90068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34350 1726853747.90105: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.90129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34350 1726853747.90178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34350 1726853747.90198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 34350 1726853747.90237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34350 1726853747.90240: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a35bec0> <<< 34350 1726853747.90267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 34350 1726853747.90298: stdout chunk (state=3): >>>import '_operator' # <<< 34350 
1726853747.90319: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a35bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 34350 1726853747.90342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34350 1726853747.90366: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 34350 1726853747.90427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.90440: stdout chunk (state=3): >>>import 'itertools' # <<< 34350 1726853747.90479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a393830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 34350 1726853747.90512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a393ec0> <<< 34350 1726853747.90515: stdout chunk (state=3): >>>import '_collections' # <<< 34350 1726853747.90576: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a373b60> <<< 34350 1726853747.90583: stdout chunk (state=3): >>>import '_functools' # <<< 34350 1726853747.90607: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3712b0> <<< 34350 1726853747.90699: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f7d1a359070> <<< 34350 1726853747.90735: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 34350 1726853747.90738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 34350 1726853747.90775: stdout chunk (state=3): >>>import '_sre' # <<< 34350 1726853747.90786: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 34350 1726853747.90824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 34350 1726853747.90837: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 34350 1726853747.90878: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b37d0> <<< 34350 1726853747.90897: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b23f0> <<< 34350 1726853747.90923: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a372150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b0bc0> <<< 34350 1726853747.90982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 34350 1726853747.91013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3582f0> <<< 34350 1726853747.91038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34350 1726853747.91064: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a3e8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e8bf0> <<< 34350 1726853747.91105: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.91123: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a3e8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a356e10> <<< 34350 1726853747.91150: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.91176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34350 1726853747.91213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 34350 1726853747.91259: stdout chunk 
(state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e9370> import 'importlib.machinery' # <<< 34350 1726853747.91262: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 34350 1726853747.91308: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea540> <<< 34350 1726853747.91311: stdout chunk (state=3): >>>import 'importlib.util' # <<< 34350 1726853747.91341: stdout chunk (state=3): >>>import 'runpy' # <<< 34350 1726853747.91358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34350 1726853747.91393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34350 1726853747.91409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a400740> <<< 34350 1726853747.91426: stdout chunk (state=3): >>>import 'errno' # <<< 34350 1726853747.91474: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a401e20> <<< 34350 1726853747.91512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code 
object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34350 1726853747.91516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 34350 1726853747.91541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a402cc0> <<< 34350 1726853747.91590: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.91605: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a4032f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a402210> <<< 34350 1726853747.91628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 34350 1726853747.91681: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.91698: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a403d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a4034a0> <<< 34350 1726853747.91736: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea4b0> <<< 34350 1726853747.91752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches 
/usr/lib64/python3.12/tempfile.py <<< 34350 1726853747.91813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34350 1726853747.91817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34350 1726853747.91831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34350 1726853747.91860: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a0f3c20> <<< 34350 1726853747.91883: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34350 1726853747.91923: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11c6e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11c440> <<< 34350 1726853747.91952: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11c710> <<< 34350 1726853747.91987: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34350 1726853747.91996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34350 1726853747.92066: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.92202: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11d040> <<< 34350 1726853747.92326: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11da30> <<< 34350 1726853747.92356: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11c8f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a0f1dc0> <<< 34350 1726853747.92382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34350 1726853747.92411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34350 1726853747.92423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34350 1726853747.92453: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11ee40> <<< 34350 
1726853747.92485: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11db80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea6f0> <<< 34350 1726853747.92500: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34350 1726853747.92563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.92590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 34350 1726853747.92614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34350 1726853747.92638: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1471a0> <<< 34350 1726853747.92709: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34350 1726853747.92733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34350 1726853747.92749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 34350 1726853747.92788: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a16b560> <<< 34350 1726853747.92819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34350 1726853747.92849: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34350 1726853747.92893: stdout chunk (state=3): >>>import 'ntpath' # <<< 34350 1726853747.92943: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cc2f0> <<< 34350 1726853747.92958: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 34350 1726853747.92975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34350 1726853747.93000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34350 1726853747.93038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34350 1726853747.93118: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cea50> <<< 34350 1726853747.93187: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cc410> <<< 34350 1726853747.93225: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1912e0> <<< 34350 1726853747.93260: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b113d0> <<< 34350 1726853747.93287: stdout chunk (state=3): >>>import 'zipfile._path' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a16a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11fd70> <<< 34350 1726853747.93450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34350 1726853747.93460: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d1a16a6c0> <<< 34350 1726853747.93729: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_q48vb6l8/ansible_setup_payload.zip' # zipimport: zlib available <<< 34350 1726853747.93856: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.93890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34350 1726853747.93903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34350 1726853747.93944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34350 1726853747.94204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 34350 1726853747.94206: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b7b0b0> import '_typing' # <<< 34350 1726853747.94239: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b59fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b59100> <<< 34350 1726853747.94249: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.94276: 
stdout chunk (state=3): >>>import 'ansible' # <<< 34350 1726853747.94321: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853747.94334: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 34350 1726853747.94350: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.97324: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853747.97949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 34350 1726853747.97981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 34350 1726853747.97989: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b78f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853747.98217: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19baa9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa7b0> import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa0c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34350 1726853747.98300: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b7bd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.98307: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19bab740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19bab980> <<< 34350 1726853747.98378: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34350 1726853747.98392: stdout chunk (state=3): >>>import '_locale' # <<< 34350 1726853747.98450: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19babec0> <<< 34350 1726853747.98458: stdout chunk (state=3): >>>import 'pwd' # <<< 34350 1726853747.98483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34350 1726853747.98513: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 34350 1726853747.98585: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a19ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.98591: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a1b8c0> <<< 34350 1726853747.98621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 34350 1726853747.98672: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1c2c0> <<< 34350 1726853747.98694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34350 1726853747.98746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1d460> <<< 34350 1726853747.98804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34350 1726853747.98829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34350 1726853747.98907: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1fef0> <<< 34350 1726853747.98954: 
stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853747.98957: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a24230> <<< 34350 1726853747.98991: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 34350 1726853747.99047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 34350 1726853747.99099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 34350 1726853747.99226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34350 1726853747.99252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34350 1726853747.99272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a27e90> <<< 34350 1726853747.99285: stdout chunk (state=3): >>>import '_tokenize' # <<< 34350 1726853747.99392: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a26960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7d19a266c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34350 1726853747.99395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34350 1726853747.99533: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a26c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1e6c0> <<< 34350 1726853747.99588: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a6be90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6c110> <<< 34350 1726853747.99615: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 34350 1726853747.99649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34350 1726853747.99715: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a6dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 34350 1726853747.99805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a70260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6e390> <<< 34350 1726853747.99902: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34350 1726853747.99915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 34350 1726853747.99926: stdout chunk (state=3): >>>import '_string' # <<< 34350 1726853747.99977: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a739e0> <<< 34350 1726853748.00166: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a703e0> <<< 34350 1726853748.00243: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.00250: stdout chunk (state=3): 
>>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74800> <<< 34350 1726853748.00297: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74ad0> <<< 34350 1726853748.00329: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74b00> <<< 34350 1726853748.00368: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6c410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 34350 1726853748.00373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34350 1726853748.00410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34350 1726853748.00443: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.00552: stdout chunk (state=3): 
>>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d199042f0> <<< 34350 1726853748.00704: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19905310> <<< 34350 1726853748.00722: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a76a80> <<< 34350 1726853748.00828: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a77e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a766c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 34350 1726853748.00922: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.01047: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.01053: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 34350 1726853748.01078: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.01085: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.01104: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 34350 1726853748.01286: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.01466: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.02598: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.03222: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 34350 1726853748.03240: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 34350 1726853748.03247: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 34350 1726853748.03264: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34350 1726853748.03290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.03343: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19909700> <<< 34350 1726853748.03453: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34350 1726853748.03481: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990aab0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199055b0> <<< 34350 1726853748.03547: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 34350 1726853748.03551: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.03580: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34350 1726853748.03585: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34350 1726853748.03609: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.03824: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.04046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34350 1726853748.04067: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990ac90> <<< 34350 1726853748.04074: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.04811: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05314: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05386: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05473: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34350 1726853748.05476: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05508: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05552: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34350 1726853748.05577: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05728: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34350 1726853748.05747: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 34350 1726853748.05801: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.05828: stdout chunk (state=3): >>>import 
'ansible.module_utils.parsing.convert_bool' # <<< 34350 1726853748.05854: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.06404: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.06801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 34350 1726853748.06946: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990b830> # zipimport: zlib available <<< 34350 1726853748.06995: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07084: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34350 1726853748.07095: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 34350 1726853748.07108: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 34350 1726853748.07160: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07196: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07240: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34350 1726853748.07328: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34350 1726853748.07401: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07444: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07600: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.07638: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 34350 1726853748.07643: stdout chunk (state=3): >>> <<< 34350 1726853748.07716: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 34350 1726853748.07720: stdout chunk (state=3): >>> <<< 34350 1726853748.07847: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 34350 1726853748.07851: stdout chunk (state=3): >>> <<< 34350 1726853748.07867: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.07930: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19916480> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19911a60><<< 34350 1726853748.07936: stdout chunk (state=3): >>> <<< 34350 1726853748.07977: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 34350 1726853748.07984: stdout chunk (state=3): >>> <<< 34350 1726853748.08000: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 34350 1726853748.08021: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.08026: stdout chunk (state=3): >>> <<< 34350 1726853748.08120: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.08254: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.08317: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 34350 1726853748.08343: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.08368: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py <<< 34350 1726853748.08398: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34350 1726853748.08418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 34350 1726853748.08514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34350 1726853748.08520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 34350 1726853748.08550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34350 1726853748.08781: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199fad50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19bd6a20> <<< 34350 1726853748.08881: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19916630> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990b710> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 34350 1726853748.08951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34350 1726853748.08954: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.08957: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 34350 1726853748.08980: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09063: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09189: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34350 1726853748.09193: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09209: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09247: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09296: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09324: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 34350 1726853748.09379: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09442: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09516: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09537: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.09568: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 34350 1726853748.09758: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.10140: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 34350 1726853748.10176: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a6540> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 34350 1726853748.10192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 34350 1726853748.10267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 34350 1726853748.10288: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 34350 1726853748.10304: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 34350 1726853748.10313: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953c470> <<< 34350 1726853748.10341: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.10381: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953c860> <<< 34350 1726853748.10442: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1998d160> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a70b0> <<< 34350 1726853748.10478: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4cb0> <<< 34350 1726853748.10506: stdout chunk (state=3): >>>import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 34350 1726853748.10593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 34350 1726853748.10596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 34350 1726853748.10631: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 34350 1726853748.10740: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953f6b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953ef90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953f140> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953e3c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 34350 1726853748.10915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 34350 1726853748.10932: stdout chunk 
(state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953f860> <<< 34350 1726853748.10954: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 34350 1726853748.11024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.11029: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19586360> <<< 34350 1726853748.11068: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195843b0> <<< 34350 1726853748.11106: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4980> import 'ansible.module_utils.facts.timeout' # <<< 34350 1726853748.11156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 34350 1726853748.11196: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.11270: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.11303: stdout chunk (state=3): >>> <<< 34350 1726853748.11373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 34350 1726853748.11409: stdout chunk (state=3): >>> # zipimport: zlib available <<< 34350 1726853748.11516: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.11595: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 34350 1726853748.11627: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34350 1726853748.11633: stdout chunk (state=3): >>> <<< 34350 1726853748.11665: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 34350 1726853748.11691: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34350 1726853748.11747: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34350 1726853748.11752: stdout chunk (state=3): >>> <<< 34350 1726853748.11805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 34350 1726853748.11810: stdout chunk (state=3): >>> <<< 34350 1726853748.11833: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.11913: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34350 1726853748.11918: stdout chunk (state=3): >>> <<< 34350 1726853748.11994: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 34350 1726853748.11999: stdout chunk (state=3): >>> <<< 34350 1726853748.12028: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.12101: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.12187: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available<<< 34350 1726853748.12284: stdout chunk (state=3): >>> # zipimport: zlib available<<< 34350 1726853748.12290: stdout chunk (state=3): >>> <<< 34350 1726853748.12388: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.12393: stdout chunk (state=3): >>> <<< 34350 1726853748.12490: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.12495: stdout chunk (state=3): >>> <<< 34350 1726853748.12585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 34350 1726853748.12595: stdout chunk (state=3): >>> <<< 34350 1726853748.12611: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 34350 1726853748.12617: stdout chunk (state=3): >>> <<< 34350 1726853748.12643: stdout chunk (state=3): >>># zipimport: zlib available<<< 34350 1726853748.12797: stdout chunk (state=3): >>> <<< 34350 1726853748.13261: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.15091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 34350 1726853748.15114: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19586570> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19587140> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 34350 1726853748.15400: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib 
available # zipimport: zlib available <<< 34350 1726853748.15498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 34350 1726853748.15527: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.15676: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.15704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 34350 1726853748.15719: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.15773: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.15856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 34350 1726853748.15995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 34350 1726853748.16013: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.16103: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.16106: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d195ca660> <<< 34350 1726853748.16403: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195b89b0> <<< 34350 1726853748.16411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 34350 1726853748.16496: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.16575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 34350 1726853748.16581: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.16707: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 
1726853748.16829: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17008: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17222: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 34350 1726853748.17227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 34350 1726853748.17297: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.17343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 34350 1726853748.17348: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17404: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 34350 1726853748.17610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d193aa180> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195ca480> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 34350 1726853748.17624: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 34350 1726853748.17680: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.17928: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34350 1726853748.18153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 34350 1726853748.18166: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.18311: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.18457: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.18511: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.18564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 34350 1726853748.18573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 34350 1726853748.18589: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.18695: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.18844: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.19052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 34350 1726853748.19063: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 34350 1726853748.19066: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.19250: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.19429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 34350 1726853748.19446: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.19481: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.19532: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.20496: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.21205: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 34350 1726853748.21227: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 34350 1726853748.21377: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.21531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 34350 1726853748.21542: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.21689: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.21825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 34350 1726853748.21843: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.22199: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.22283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 34350 1726853748.22289: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.22575: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.22711: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.23026: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.23512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 34350 1726853748.23735: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.23811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 34350 1726853748.23880: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 
1726853748.23986: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 34350 1726853748.24020: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.24083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 34350 1726853748.24176: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.24350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 34350 1726853748.24863: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 34350 1726853748.25273: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25350: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 34350 1726853748.25438: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25528: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 34350 1726853748.25533: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25576: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25609: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 34350 1726853748.25664: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.25701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 34350 1726853748.25719: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25825: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.25941: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.sunos' # <<< 34350 1726853748.25984: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.25994: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 34350 1726853748.26038: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.26105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 34350 1726853748.26144: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.26211: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.26364: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.26472: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 34350 1726853748.26480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 34350 1726853748.26490: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.26550: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.26614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 34350 1726853748.26627: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.26935: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27240: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 34350 1726853748.27246: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27304: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 34350 1726853748.27486: stdout chunk (state=3): >>># zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.openbsd' # <<< 34350 1726853748.27496: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27609: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 34350 1726853748.27732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 34350 1726853748.27735: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.27857: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.28089: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 34350 1726853748.29452: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 34350 1726853748.29486: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 34350 1726853748.29508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 34350 1726853748.29542: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.29550: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d193db860> <<< 34350 1726853748.29569: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d193d9a90> 
<<< 34350 1726853748.29636: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d193d9fd0> <<< 34350 1726853748.30228: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "48", "epoch": "1726853748", "epoch_int": "1726853748", "date": "2024-09-20", "time": "13:35:48", "iso8601_micro": "2024-09-20T17:35:48.288935Z", "iso8601": "2024-09-20T17:35:48Z", "iso8601_basic": "20240920T133548288935", "iso8601_basic_short": "20240920T133548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "us<<< 34350 1726853748.30245: stdout chunk (state=3): >>>er", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 34350 1726853748.31407: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # 
cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # 
cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanu<<< 34350 1726853748.31432: stdout chunk (state=3): >>>p[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy 
token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] 
removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing<<< 34350 1726853748.31462: stdout chunk (state=3): >>> ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle 
# cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing 
_ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # 
cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansibl<<< 34350 1726853748.31473: stdout chunk (state=3): >>>e.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # 
destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 34350 1726853748.31676: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34350 1726853748.31691: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34350 1726853748.31719: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 34350 1726853748.31737: stdout chunk (state=3): >>># destroy _blake2 <<< 34350 1726853748.31745: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34350 1726853748.31766: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 34350 1726853748.31807: stdout chunk (state=3): >>># destroy ntpath <<< 34350 1726853748.31833: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 34350 1726853748.31841: stdout chunk (state=3): >>># destroy json.scanner <<< 34350 1726853748.31848: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 34350 1726853748.31875: stdout chunk (state=3): >>># destroy _locale # 
destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 34350 1726853748.31924: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 34350 1726853748.31942: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 34350 1726853748.31987: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 34350 1726853748.31998: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 34350 1726853748.32004: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 34350 1726853748.32031: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 34350 1726853748.32054: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 34350 1726853748.32068: stdout chunk (state=3): >>># destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl <<< 34350 1726853748.32082: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 34350 1726853748.32120: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 34350 1726853748.32153: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob <<< 34350 1726853748.32216: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping 
selinux._selinux <<< 34350 1726853748.32250: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 34350 1726853748.32258: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 34350 1726853748.32274: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 34350 1726853748.32284: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 34350 1726853748.32295: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 34350 1726853748.32319: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 34350 1726853748.32348: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os <<< 34350 1726853748.32376: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34350 1726853748.32568: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 34350 1726853748.32602: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 34350 1726853748.32607: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 34350 1726853748.32656: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 34350 1726853748.32665: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34350 1726853748.32705: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear 
sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34350 1726853748.32805: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna <<< 34350 1726853748.32830: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34350 1726853748.32869: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 34350 1726853748.32898: stdout chunk (state=3): >>># destroy _operator <<< 34350 1726853748.32902: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools <<< 34350 1726853748.32925: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 34350 1726853748.32930: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 34350 1726853748.33428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 34350 1726853748.33457: stderr chunk (state=3): >>><<< 34350 1726853748.33460: stdout chunk (state=3): >>><<< 34350 1726853748.33567: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a5684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a537b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a56aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a31d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a31dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a35bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a35bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a393830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a393ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a373b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3712b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a359070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a372150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3b0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a3e8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a3e8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a356e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3e9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a400740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a401e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a402cc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a4032f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a402210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a403d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a4034a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a0f3c20> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11c6e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11c440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11c710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11d040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1a11da30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11c8f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a0f1dc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11ee40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11db80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a3ea6f0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1471a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a16b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cc2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1cc410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a1912e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b113d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a16a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1a11fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7d1a16a6c0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_q48vb6l8/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7d19b7b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b59fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b59100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b78f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19baa9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa7b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa0c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19baa510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19b7bd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19bab740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19bab980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19babec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a19ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a1b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a24230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a27e90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a26960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a266c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a26c30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a1e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a6be90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a6dca0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6da60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a70260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6e390> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a739e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a703e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74ad0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a74b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a6c410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d199042f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19905310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a76a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19a77e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19a766c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19909700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990aab0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199055b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990ac90> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990b830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19916480> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19911a60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199fad50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19bd6a20> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19916630> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1990b710> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a6540> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953c470> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953c860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1998d160> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a70b0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4cb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953f6b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953ef90> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d1953f140> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953e3c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d1953f860> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d19586360> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195843b0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d199a4980> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19586570> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d19587140> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d195ca660> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195b89b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d193aa180> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d195ca480> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7d193db860> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d193d9a90> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7d193d9fd0> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "48", "epoch": "1726853748", "epoch_int": "1726853748", "date": "2024-09-20", "time": "13:35:48", "iso8601_micro": "2024-09-20T17:35:48.288935Z", "iso8601": "2024-09-20T17:35:48Z", "iso8601_basic": "20240920T133548288935", "iso8601_basic_short": "20240920T133548", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] 
removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib 
# destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] 
removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # 
destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # 
cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] 
wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.45.153 closed.
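The `auto-mux: Trying existing master` / `mux_client_request_session` lines above show Ansible reusing one persistent OpenSSH ControlMaster connection for every task instead of opening a fresh TCP/SSH session each time. A minimal sketch of the settings that produce this behavior — the timeout value here is an illustrative assumption, not something read from this log:

```ini
; ansible.cfg — SSH connection multiplexing sketch (values are assumptions)
[ssh_connection]
; ControlMaster=auto reuses an existing master socket when one is alive;
; ControlPersist keeps that master socket open between tasks.
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
; Directory for the control sockets (matches the /root/.ansible/cp path seen above)
control_path_dir = ~/.ansible/cp
```

With these settings, only the first task pays the SSH handshake cost; later tasks attach to the existing master, which is why the log shows `master session id: 2` rather than a new key exchange.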
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery
34350 1726853748.34379: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
34350 1726853748.34382: _low_level_execute_command(): starting
34350 1726853748.34385: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853747.7185016-34421-123272660633076/ > /dev/null 2>&1 && sleep 0'
34350 1726853748.34387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
34350 1726853748.34389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
34350 1726853748.34391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
34350 1726853748.34402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
34350 1726853748.34425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34350 1726853748.34428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
34350 1726853748.34430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<<
34350 1726853748.34438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
34350 1726853748.34494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<<
34350 1726853748.34497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
34350 1726853748.34551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
34350 1726853748.37095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
34350 1726853748.37124: stderr chunk (state=3): >>><<<
34350 1726853748.37127: stdout chunk (state=3): >>><<<
34350 1726853748.37140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
34350 1726853748.37145: handler run complete
34350 1726853748.37177: variable 'ansible_facts' from source: unknown
34350 1726853748.37213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853748.37288: variable 'ansible_facts' from source: unknown
34350 1726853748.37317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853748.37352: attempt loop complete, returning result
34350 1726853748.37355: _execute() done
34350 1726853748.37358: dumping result to json
34350 1726853748.37369: done dumping result, returning
34350 1726853748.37378: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-b6c1-0de4-000000000158]
34350 1726853748.37381: sending task result for task 02083763-bbaf-b6c1-0de4-000000000158
34350 1726853748.37502: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000158
34350 1726853748.37504: WORKER PROCESS EXITING
ok: [managed_node1]
34350 1726853748.37600: no more pending results, returning what we have
34350 1726853748.37603: results queue empty
34350 1726853748.37604: checking for any_errors_fatal
34350 1726853748.37606: done checking for any_errors_fatal
34350 1726853748.37606: checking for max_fail_percentage
34350 1726853748.37607: done checking for max_fail_percentage
34350 1726853748.37608: checking to see if all hosts have failed and the running result is not ok
34350 1726853748.37609: done checking to see if all hosts have failed
34350 1726853748.37609: getting the remaining hosts for this loop
34350 1726853748.37611: done getting the remaining hosts for this loop
34350 1726853748.37623: getting the next task for host managed_node1
34350 1726853748.37632: done getting next task for host managed_node1
34350 1726853748.37634: ^ task is: TASK: Check if system is ostree
34350 1726853748.37637: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853748.37640: getting variables
34350 1726853748.37641: in VariableManager get_vars()
34350 1726853748.37667: Calling all_inventory to load vars for managed_node1
34350 1726853748.37669: Calling groups_inventory to load vars for managed_node1
34350 1726853748.37674: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853748.37684: Calling all_plugins_play to load vars for managed_node1
34350 1726853748.37686: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853748.37688: Calling groups_plugins_play to load vars for managed_node1
34350 1726853748.37825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853748.37945: done with get_vars()
34350 1726853748.37953: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024 13:35:48 -0400 (0:00:00.741) 0:00:02.252 ******
34350 1726853748.38019: entering _queue_task() for managed_node1/stat
34350 1726853748.38215: worker is 1 (out of 1 available)
34350 1726853748.38227: exiting _queue_task() for managed_node1/stat
34350 1726853748.38239: done queuing things up, now waiting for results queue to drain
34350 1726853748.38240: waiting for pending results...
34350 1726853748.38388: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 34350 1726853748.38451: in run() - task 02083763-bbaf-b6c1-0de4-00000000015a 34350 1726853748.38466: variable 'ansible_search_path' from source: unknown 34350 1726853748.38469: variable 'ansible_search_path' from source: unknown 34350 1726853748.38494: calling self._execute() 34350 1726853748.38550: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853748.38554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853748.38563: variable 'omit' from source: magic vars 34350 1726853748.38901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34350 1726853748.39079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34350 1726853748.39112: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34350 1726853748.39137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34350 1726853748.39163: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34350 1726853748.39246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34350 1726853748.39272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34350 1726853748.39292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34350 1726853748.39310: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34350 1726853748.39575: Evaluated conditional (not __network_is_ostree is defined): True 34350 1726853748.39578: variable 'omit' from source: magic vars 34350 1726853748.39581: variable 'omit' from source: magic vars 34350 1726853748.39583: variable 'omit' from source: magic vars 34350 1726853748.39585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34350 1726853748.39587: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34350 1726853748.39589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34350 1726853748.39622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853748.39635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853748.39669: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853748.39690: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853748.39697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853748.39876: Set connection var ansible_timeout to 10 34350 1726853748.39880: Set connection var ansible_module_compression to ZIP_DEFLATED 34350 1726853748.39882: Set connection var ansible_pipelining to False 34350 1726853748.39885: Set connection var ansible_shell_executable to /bin/sh 34350 1726853748.39887: Set connection var ansible_connection to ssh 34350 1726853748.39889: Set connection var ansible_shell_type to sh 34350 1726853748.39891: variable 'ansible_shell_executable' from source: unknown 34350 1726853748.39893: variable 'ansible_connection' from 
source: unknown 34350 1726853748.39896: variable 'ansible_module_compression' from source: unknown 34350 1726853748.39903: variable 'ansible_shell_type' from source: unknown 34350 1726853748.39905: variable 'ansible_shell_executable' from source: unknown 34350 1726853748.39907: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853748.39909: variable 'ansible_pipelining' from source: unknown 34350 1726853748.39911: variable 'ansible_timeout' from source: unknown 34350 1726853748.39920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853748.40065: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 34350 1726853748.40083: variable 'omit' from source: magic vars 34350 1726853748.40093: starting attempt loop 34350 1726853748.40099: running the handler 34350 1726853748.40117: _low_level_execute_command(): starting 34350 1726853748.40132: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 34350 1726853748.40851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.40883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.40887: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.40941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853748.40944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853748.40949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853748.40997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853748.43121: stdout chunk (state=3): >>>/root <<< 34350 1726853748.43172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853748.43195: stderr chunk (state=3): >>><<< 34350 1726853748.43199: stdout chunk (state=3): >>><<< 34350 1726853748.43217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 34350 1726853748.43227: _low_level_execute_command(): starting 34350 1726853748.43233: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345 `" && echo ansible-tmp-1726853748.4321609-34441-59602319690345="` echo /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345 `" ) && sleep 0' 34350 1726853748.43645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.43648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.43651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 34350 1726853748.43653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 34350 1726853748.43655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.43704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853748.43707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853748.43757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 34350 1726853748.46181: stdout chunk (state=3): >>>ansible-tmp-1726853748.4321609-34441-59602319690345=/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345 <<< 34350 1726853748.46312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853748.46323: stdout chunk (state=3): >>><<< 34350 1726853748.46336: stderr chunk (state=3): >>><<< 34350 1726853748.46394: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853748.4321609-34441-59602319690345=/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 34350 1726853748.46582: variable 'ansible_module_compression' from source: unknown 34350 1726853748.46585: ANSIBALLZ: Using lock for stat 34350 1726853748.46587: ANSIBALLZ: Acquiring lock 34350 1726853748.46589: ANSIBALLZ: Lock acquired: 140478516368480 34350 1726853748.46591: ANSIBALLZ: Creating module 34350 1726853748.70089: ANSIBALLZ: Writing module into payload 34350 1726853748.70204: ANSIBALLZ: Writing module 34350 1726853748.70230: ANSIBALLZ: Renaming module 34350 1726853748.70243: ANSIBALLZ: Done creating module 34350 1726853748.70272: variable 'ansible_facts' from source: unknown 34350 1726853748.70356: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py 34350 1726853748.70577: Sending initial data 34350 1726853748.70580: Sent initial data (152 bytes) 34350 1726853748.71100: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.71113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 34350 1726853748.71125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.71174: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853748.71187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853748.71239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34350 1726853748.73212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 34350 1726853748.73268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 34350 1726853748.73321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmp93tpv38s /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py <<< 34350 1726853748.73326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py" <<< 34350 1726853748.73375: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-34350pt_rq5b8/tmp93tpv38s" to remote "/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py" <<< 34350 1726853748.74070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853748.74119: stderr chunk (state=3): >>><<< 34350 1726853748.74126: stdout chunk (state=3): >>><<< 34350 1726853748.74176: done transferring module to remote 34350 1726853748.74179: _low_level_execute_command(): starting 34350 1726853748.74185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/ /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py && sleep 0' 34350 1726853748.74553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.74586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 34350 1726853748.74589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.74591: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 34350 1726853748.74593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.74637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853748.74646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853748.74694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34350 1726853748.77056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853748.77093: stderr chunk (state=3): >>><<< 34350 1726853748.77096: stdout chunk (state=3): >>><<< 34350 1726853748.77106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 34350 1726853748.77167: _low_level_execute_command(): starting 34350 1726853748.77173: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/AnsiballZ_stat.py && sleep 0' 34350 1726853748.77513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 34350 1726853748.77533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 34350 1726853748.77582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 34350 1726853748.77588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853748.77598: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853748.77655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34350 1726853748.80681: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 34350 1726853748.80735: stdout chunk (state=3): >>>import _imp # builtin <<< 34350 1726853748.80784: stdout chunk (state=3): >>>import '_thread' # <<< 34350 1726853748.80811: stdout chunk (state=3): >>>import '_warnings' # <<< 34350 1726853748.80816: stdout chunk (state=3): >>>import '_weakref' # <<< 34350 1726853748.80911: stdout chunk (state=3): >>>import '_io' # <<< 34350 1726853748.80926: stdout chunk (state=3): >>> import 'marshal' # <<< 34350 1726853748.80983: stdout chunk (state=3): >>> import 'posix' # <<< 34350 1726853748.80988: stdout chunk (state=3): >>> <<< 34350 1726853748.81037: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 34350 1726853748.81045: stdout chunk (state=3): >>> <<< 34350 1726853748.81055: stdout chunk (state=3): >>># installing zipimport hook <<< 34350 1726853748.81109: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 34350 1726853748.81112: stdout chunk (state=3): >>> # installed zipimport hook<<< 34350 1726853748.81176: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 34350 1726853748.81198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 34350 1726853748.81206: stdout chunk (state=3): >>> <<< 34350 1726853748.81235: stdout chunk (state=3): >>>import '_codecs' # <<< 34350 1726853748.81401: stdout chunk (state=3): >>> import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 
'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bbea50> import '_signal' # <<< 34350 1726853748.81445: stdout chunk (state=3): >>>import '_abc' # <<< 34350 1726853748.81451: stdout chunk (state=3): >>> import 'abc' # <<< 34350 1726853748.81481: stdout chunk (state=3): >>> import 'io' # <<< 34350 1726853748.81530: stdout chunk (state=3): >>>import '_stat' # <<< 34350 1726853748.81539: stdout chunk (state=3): >>> import 'stat' # <<< 34350 1726853748.81621: stdout chunk (state=3): >>> <<< 34350 1726853748.81701: stdout chunk (state=3): >>>import '_collections_abc' # <<< 34350 1726853748.81706: stdout chunk (state=3): >>> <<< 34350 1726853748.81747: stdout chunk (state=3): >>>import 'genericpath' # <<< 34350 1726853748.81764: stdout chunk (state=3): >>> <<< 34350 1726853748.81767: stdout chunk (state=3): >>>import 'posixpath' # <<< 34350 1726853748.81774: stdout chunk (state=3): >>> <<< 34350 1726853748.81814: stdout chunk (state=3): >>>import 'os' # <<< 34350 1726853748.81819: stdout chunk (state=3): >>> <<< 34350 1726853748.81858: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages<<< 34350 1726853748.81865: stdout chunk (state=3): >>> <<< 34350 1726853748.81886: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages'<<< 34350 1726853748.81906: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 34350 1726853748.81918: stdout chunk (state=3): >>> Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 34350 1726853748.81954: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 34350 1726853748.81963: stdout chunk (state=3): >>> <<< 34350 1726853748.81984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 34350 1726853748.82081: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 34350 1726853748.82111: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.82133: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bcdfa0> <<< 34350 1726853748.82178: stdout chunk (state=3): >>>import 'site' # <<< 34350 1726853748.82227: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 34350 1726853748.82412: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. 
<<< 34350 1726853748.82621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 34350 1726853748.82672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 34350 1726853748.82713: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 34350 1726853748.82716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.82749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 34350 1726853748.82813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 34350 1726853748.82850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 34350 1726853748.82890: stdout chunk (state=3): >>> <<< 34350 1726853748.82916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 34350 1726853748.82930: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59cbec0> <<< 34350 1726853748.82954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 34350 1726853748.82988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 34350 1726853748.83047: stdout chunk (state=3): >>> import '_operator' # <<< 34350 1726853748.83050: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59cbf80> <<< 34350 1726853748.83139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 34350 1726853748.83169: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 34350 1726853748.83240: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.83263: stdout chunk (state=3): >>>import 'itertools' # <<< 34350 1726853748.83299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 34350 1726853748.83303: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 34350 1726853748.83332: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a038c0><<< 34350 1726853748.83335: stdout chunk (state=3): >>> <<< 34350 1726853748.83353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 34350 1726853748.83377: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 34350 1726853748.83411: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a03f50> import '_collections' # <<< 34350 1726853748.83491: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e3b90> <<< 34350 1726853748.83496: stdout chunk (state=3): >>>import '_functools' # <<< 34350 1726853748.83528: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e12b0> <<< 34350 1726853748.83814: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c9070> # 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a23800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a22420> <<< 34350 1726853748.83843: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 34350 1726853748.83908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e2180> <<< 34350 1726853748.83921: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c8260> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a58860> <<< 34350 1726853748.83924: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c82f0> <<< 34350 1726853748.83948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 34350 1726853748.83954: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 34350 1726853748.83988: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.83992: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a58d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a58bc0> <<< 34350 1726853748.84027: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a58f80> <<< 34350 1726853748.84073: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c6e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.84097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 34350 1726853748.84138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a59610> <<< 34350 1726853748.84182: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a592e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 34350 1726853748.84208: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5a510> <<< 34350 1726853748.84213: stdout chunk (state=3): >>>import 'importlib.util' # <<< 34350 1726853748.84253: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 34350 1726853748.84292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 34350 1726853748.84338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a70710> import 'errno' # <<< 34350 1726853748.84373: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a71df0> <<< 34350 1726853748.84401: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 34350 1726853748.84433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 34350 1726853748.84482: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a72c90> # 
extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.84487: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a732f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a721e0> <<< 34350 1726853748.84516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 34350 1726853748.84567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.84576: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a73d70> <<< 34350 1726853748.84582: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a734a0> <<< 34350 1726853748.84640: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5a540> <<< 34350 1726853748.84683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 34350 1726853748.84707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 34350 1726853748.84729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 34350 1726853748.84790: stdout chunk (state=3): >>># extension module 'math' 
loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc57fbc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 34350 1726853748.84847: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc58246e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5824440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5824560> <<< 34350 1726853748.84879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 34350 1726853748.84898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 34350 1726853748.85099: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.85147: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5bc5825010> <<< 34350 1726853748.85315: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5825a00> <<< 34350 1726853748.85318: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58248c0> <<< 34350 1726853748.85340: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc57f9e20> <<< 34350 1726853748.85367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 34350 1726853748.85388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 34350 1726853748.85416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 34350 1726853748.85420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 34350 1726853748.85439: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5826de0> <<< 34350 1726853748.85460: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5825b50> <<< 34350 1726853748.85488: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5ac30> <<< 34350 1726853748.85509: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 34350 1726853748.85594: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.85650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 34350 1726853748.85688: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5853140> <<< 34350 1726853748.85752: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 34350 1726853748.85774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.85795: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 34350 1726853748.85872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5873500> <<< 34350 1726853748.85893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 34350 1726853748.85952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 34350 1726853748.86029: stdout chunk (state=3): >>>import 'ntpath' # <<< 34350 1726853748.86055: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d42c0> <<< 34350 1726853748.86111: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 34350 1726853748.86140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 34350 1726853748.86191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 34350 1726853748.86416: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d6a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d43e0> <<< 34350 1726853748.86468: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58992b0> <<< 34350 1726853748.86491: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51213d0> <<< 34350 1726853748.86514: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5872300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5827d40> <<< 34350 1726853748.86681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 34350 1726853748.86701: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5bc5872660> <<< 34350 1726853748.87050: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_uzlh8mkd/ansible_stat_payload.zip' # zipimport: zlib available <<< 34350 1726853748.87264: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34350 1726853748.87289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 34350 1726853748.87318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 34350 1726853748.87363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 34350 1726853748.87502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51770b0> <<< 34350 1726853748.87506: stdout chunk (state=3): >>>import '_typing' # <<< 34350 1726853748.87773: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5155fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5155100> <<< 34350 1726853748.87813: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 34350 1726853748.87820: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.87854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.87877: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 34350 1726853748.90032: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.91801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5174f50> <<< 34350 1726853748.91849: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 34350 1726853748.91882: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 34350 1726853748.91956: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e840> <<< 34350 1726853748.91991: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e150> <<< 34350 1726853748.92018: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 34350 1726853748.92023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 34350 1726853748.92068: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e8d0> <<< 34350 1726853748.92074: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5177ad0> <<< 34350 
1726853748.92107: stdout chunk (state=3): >>>import 'atexit' # <<< 34350 1726853748.92112: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519f800> <<< 34350 1726853748.92148: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519fa10> <<< 34350 1726853748.92216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 34350 1726853748.92237: stdout chunk (state=3): >>>import '_locale' # <<< 34350 1726853748.92288: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519ff50> <<< 34350 1726853748.92302: stdout chunk (state=3): >>>import 'pwd' # <<< 34350 1726853748.92325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 34350 1726853748.92393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5009d90> <<< 34350 1726853748.92424: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.92429: stdout chunk (state=3): >>># extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc500b9b0> <<< 34350 1726853748.92446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 34350 1726853748.92508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500c380> <<< 34350 1726853748.92528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 34350 1726853748.92559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 34350 1726853748.92585: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500d520> <<< 34350 1726853748.92643: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 34350 1726853748.92669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 34350 1726853748.92675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 34350 1726853748.92740: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500ff50> <<< 34350 1726853748.92785: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import 
'_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc58242f0> <<< 34350 1726853748.92930: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500e240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 34350 1726853748.92960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 34350 1726853748.92967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5017ec0> <<< 34350 1726853748.93061: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5016990> <<< 34350 1726853748.93075: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50166f0> <<< 34350 1726853748.93090: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 34350 1726853748.93096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 34350 1726853748.93200: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5016c60> <<< 34350 1726853748.93241: stdout chunk (state=3): >>>import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500e750> <<< 34350 1726853748.93267: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.93273: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc505ffe0> <<< 34350 1726853748.93293: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5060200> <<< 34350 1726853748.93344: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 34350 1726853748.93361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 34350 1726853748.93419: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5061d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5061ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches 
/usr/lib64/python3.12/uuid.py <<< 34350 1726853748.93627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.93633: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5064260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50623c0> <<< 34350 1726853748.93657: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 34350 1726853748.93703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.93726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 34350 1726853748.93747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 34350 1726853748.93899: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5067a40> <<< 34350 1726853748.93988: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5064410> <<< 34350 1726853748.94067: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.94074: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f5bc5068800> <<< 34350 1726853748.94101: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5068a40> <<< 34350 1726853748.94153: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5068dd0> <<< 34350 1726853748.94167: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5060440> <<< 34350 1726853748.94196: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 34350 1726853748.94239: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 34350 1726853748.94268: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.94291: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f43e0> <<< 
34350 1726853748.94529: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f5700> <<< 34350 1726853748.94567: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc506ab70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853748.94574: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc506bf20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc506a7e0> <<< 34350 1726853748.94600: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.94605: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 34350 1726853748.94744: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.94878: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.94882: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 34350 1726853748.94894: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.94911: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 34350 1726853748.94933: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.95105: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.95289: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 34350 1726853748.96154: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.97034: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 34350 1726853748.97039: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 34350 1726853748.97046: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 34350 1726853748.97062: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 34350 1726853748.97085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853748.97250: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f9940> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 34350 1726853748.97274: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fa660> <<< 34350 1726853748.97280: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50f5910> <<< 34350 1726853748.97345: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 34350 1726853748.97378: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.97385: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 34350 1726853748.97402: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.97617: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.97848: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 34350 1726853748.97870: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fa600> <<< 34350 1726853748.97994: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.98595: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99298: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99401: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99502: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 34350 1726853748.99518: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99560: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99607: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 34350 1726853748.99612: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99712: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99823: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 34350 1726853748.99843: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99854: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853748.99865: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 34350 1726853748.99919: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853748.99965: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 34350 1726853748.99969: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 34350 1726853749.00328: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.00678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 34350 1726853749.00777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 34350 1726853749.00866: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fb8f0> <<< 34350 1726853749.00978: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 34350 1726853749.01081: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 34350 1726853749.01085: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 34350 1726853749.01107: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 34350 1726853749.01120: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01169: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01221: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 34350 1726853749.01227: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01279: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01341: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01412: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01562: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 34350 1726853749.01672: stdout chunk (state=3): >>># 
extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 34350 1726853749.01678: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc4f06270> <<< 34350 1726853749.01717: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc4f03500> <<< 34350 1726853749.01754: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 34350 1726853749.01797: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01854: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01942: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.01973: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.02048: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 34350 1726853749.02074: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 34350 1726853749.02170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 34350 1726853749.02194: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches 
/usr/lib64/python3.12/gettext.py <<< 34350 1726853749.02208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 34350 1726853749.02291: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51d6ba0> <<< 34350 1726853749.02349: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51ee870> <<< 34350 1726853749.02458: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc4f063f0> <<< 34350 1726853749.02465: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fb1d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 34350 1726853749.02473: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.02508: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.02602: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 34350 1726853749.02614: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 34350 1726853749.02632: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.02638: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 34350 1726853749.02659: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.02854: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.03148: stdout chunk (state=3): >>># zipimport: zlib available <<< 34350 1726853749.03287: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 34350 1726853749.03397: stdout chunk 
(state=3): >>># destroy __main__ <<< 34350 1726853749.03758: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 34350 1726853749.03780: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 34350 1726853749.03811: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 34350 1726853749.03854: stdout chunk (state=3): >>># cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing 
collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 <<< 34350 1726853749.03896: stdout chunk (state=3): >>># cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # 
cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 34350 1726853749.03922: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 34350 1726853749.04229: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 34350 1726853749.04240: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 34350 1726853749.04258: stdout chunk (state=3): >>># destroy _bz2 <<< 34350 1726853749.04269: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 34350 1726853749.04306: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 34350 1726853749.04520: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro 
# destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections <<< 34350 1726853749.04546: stdout chunk (state=3): >>># destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 34350 1726853749.04566: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # 
destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 34350 1726853749.04570: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 34350 1726853749.04584: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 34350 1726853749.04591: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 34350 1726853749.04739: stdout chunk (state=3): >>># destroy sys.monitoring <<< 34350 1726853749.04765: stdout chunk (state=3): >>># destroy _socket <<< 34350 1726853749.04767: stdout chunk (state=3): >>># destroy _collections <<< 34350 1726853749.04794: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 34350 1726853749.04801: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 34350 1726853749.04822: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 34350 1726853749.04853: stdout chunk (state=3): >>># destroy _typing <<< 34350 1726853749.04869: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 34350 1726853749.04879: stdout chunk (state=3): >>># 
destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 34350 1726853749.04884: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 34350 1726853749.04994: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 34350 1726853749.05011: stdout chunk (state=3): >>># destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 34350 1726853749.05024: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 34350 1726853749.05039: stdout chunk (state=3): >>># destroy _random <<< 34350 1726853749.05043: stdout chunk (state=3): >>># destroy _weakref <<< 34350 1726853749.05084: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 34350 1726853749.05098: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix <<< 34350 1726853749.05104: stdout chunk (state=3): >>># destroy _functools # destroy builtins # destroy _thread <<< 34350 1726853749.05196: stdout chunk (state=3): >>># clear sys.audit hooks <<< 34350 1726853749.05548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
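The one line of signal in all of the interpreter noise above is the single JSON document the `stat` module printed to stdout before teardown: `{"changed": false, "stat": {"exists": false}, ...}` for the path `/run/ostree-booted`. Everything else is Python import tracing and shutdown cleanup from the AnsiballZ wrapper. A minimal sketch of consuming such a result (the JSON literal is copied from the capture above; the variable names are illustrative, not Ansible internals):

```python
import json

# The single JSON line the stat module emitted, copied verbatim from the log.
raw = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"path": "/run/ostree-booted", '
    '"follow": false, "get_checksum": true, "get_mime": true, '
    '"get_attributes": true, "checksum_algorithm": "sha1"}}}'
)

result = json.loads(raw)

# /run/ostree-booted absent means the managed host is not an OSTree-based
# system; stat is read-only, so "changed" is always false.
is_ostree = result["stat"]["exists"]
path_checked = result["invocation"]["module_args"]["path"]
print(is_ostree, path_checked)
```

In a play this same check is typically done with `ansible.builtin.stat` plus a `when: not ostree_booted.stat.exists` condition; parsing the raw stdout by hand as above is only useful when reading `-vvvv` logs like this one.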
<<< 34350 1726853749.05581: stderr chunk (state=3): >>><<< 34350 1726853749.05584: stdout chunk (state=3): >>><<< 34350 1726853749.05645: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5b8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5bcdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59cbec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59cbf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a038c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a03f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e3b90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e12b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c9070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a23800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a22420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59e2180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c8260> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a58860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c82f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a58d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a58bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a58f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc59c6e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a59610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a592e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a70710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a71df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a72c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a732f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a721e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5a73d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a734a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc57fbc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc58246e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5824440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5824560> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5825010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5825a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58248c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc57f9e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5826de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5825b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5a5ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5853140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5873500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d42c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d6a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58d43e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc58992b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51213d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5872300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5827d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5bc5872660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_uzlh8mkd/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5bc51770b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5155fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5155100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5174f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519eab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519e8d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5177ad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519f800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc519fa10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc519ff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5009d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc500b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500c380> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500d520> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc58242f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500e240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5017ec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5016990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50166f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5016c60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc500e750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc505ffe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5060200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5061d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5061ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5064260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50623c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5067a40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5064410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5068800> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5068a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc5068dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc5060440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f43e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f5700> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc506ab70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc506bf20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc506a7e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc50f9940> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fa660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50f5910> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fa600> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fb8f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5bc4f06270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc4f03500> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51d6ba0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc51ee870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc4f063f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5bc50fb1d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 34350 1726853749.06157: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 34350 1726853749.06161: _low_level_execute_command(): starting 34350 1726853749.06163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853748.4321609-34441-59602319690345/ > /dev/null 2>&1 && sleep 0' 34350 1726853749.06312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 34350 1726853749.06436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 34350 1726853749.06463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 34350 1726853749.06552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 34350 1726853749.08887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 34350 1726853749.08940: stderr chunk (state=3): >>><<< 34350 1726853749.08944: stdout chunk (state=3): >>><<< 34350 1726853749.08964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status 
from master 0 34350 1726853749.08973: handler run complete 34350 1726853749.08993: attempt loop complete, returning result 34350 1726853749.08997: _execute() done 34350 1726853749.08999: dumping result to json 34350 1726853749.09001: done dumping result, returning 34350 1726853749.09011: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [02083763-bbaf-b6c1-0de4-00000000015a] 34350 1726853749.09014: sending task result for task 02083763-bbaf-b6c1-0de4-00000000015a 34350 1726853749.09232: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000015a 34350 1726853749.09235: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 34350 1726853749.09336: no more pending results, returning what we have 34350 1726853749.09340: results queue empty 34350 1726853749.09341: checking for any_errors_fatal 34350 1726853749.09348: done checking for any_errors_fatal 34350 1726853749.09349: checking for max_fail_percentage 34350 1726853749.09351: done checking for max_fail_percentage 34350 1726853749.09351: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.09352: done checking to see if all hosts have failed 34350 1726853749.09353: getting the remaining hosts for this loop 34350 1726853749.09355: done getting the remaining hosts for this loop 34350 1726853749.09361: getting the next task for host managed_node1 34350 1726853749.09368: done getting next task for host managed_node1 34350 1726853749.09373: ^ task is: TASK: Set flag to indicate system is ostree 34350 1726853749.09375: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853749.09379: getting variables 34350 1726853749.09381: in VariableManager get_vars() 34350 1726853749.09529: Calling all_inventory to load vars for managed_node1 34350 1726853749.09532: Calling groups_inventory to load vars for managed_node1 34350 1726853749.09536: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.09547: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.09550: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.09553: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.09839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.10042: done with get_vars() 34350 1726853749.10053: done getting variables 34350 1726853749.10153: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:35:49 -0400 (0:00:00.721) 0:00:02.974 ****** 34350 1726853749.10187: entering _queue_task() for managed_node1/set_fact 34350 1726853749.10189: Creating lock for set_fact 34350 1726853749.10466: worker is 1 (out of 1 available) 34350 1726853749.10680: exiting _queue_task() for managed_node1/set_fact 34350 1726853749.10690: done queuing things up, now waiting for results queue to drain 34350 1726853749.10691: 
waiting for pending results... 34350 1726853749.10755: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 34350 1726853749.10876: in run() - task 02083763-bbaf-b6c1-0de4-00000000015b 34350 1726853749.10900: variable 'ansible_search_path' from source: unknown 34350 1726853749.10910: variable 'ansible_search_path' from source: unknown 34350 1726853749.10949: calling self._execute() 34350 1726853749.11029: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.11040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.11054: variable 'omit' from source: magic vars 34350 1726853749.11678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 34350 1726853749.11848: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 34350 1726853749.11903: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 34350 1726853749.11941: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 34350 1726853749.11982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 34350 1726853749.12080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 34350 1726853749.12116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 34350 1726853749.12148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 34350 1726853749.12187: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 34350 1726853749.12315: Evaluated conditional (not __network_is_ostree is defined): True 34350 1726853749.12331: variable 'omit' from source: magic vars 34350 1726853749.12376: variable 'omit' from source: magic vars 34350 1726853749.12501: variable '__ostree_booted_stat' from source: set_fact 34350 1726853749.12561: variable 'omit' from source: magic vars 34350 1726853749.12654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34350 1726853749.12657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34350 1726853749.12662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34350 1726853749.12665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853749.12679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853749.12712: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853749.12720: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.12729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.12833: Set connection var ansible_timeout to 10 34350 1726853749.12845: Set connection var ansible_module_compression to ZIP_DEFLATED 34350 1726853749.12856: Set connection var ansible_pipelining to False 34350 1726853749.12870: Set connection var ansible_shell_executable to /bin/sh 34350 1726853749.12886: Set connection var ansible_connection to ssh 34350 1726853749.12893: Set connection var ansible_shell_type to sh 34350 1726853749.12916: variable 'ansible_shell_executable' 
from source: unknown 34350 1726853749.12986: variable 'ansible_connection' from source: unknown 34350 1726853749.12989: variable 'ansible_module_compression' from source: unknown 34350 1726853749.12992: variable 'ansible_shell_type' from source: unknown 34350 1726853749.12994: variable 'ansible_shell_executable' from source: unknown 34350 1726853749.12996: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.12998: variable 'ansible_pipelining' from source: unknown 34350 1726853749.12999: variable 'ansible_timeout' from source: unknown 34350 1726853749.13001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.13069: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34350 1726853749.13087: variable 'omit' from source: magic vars 34350 1726853749.13104: starting attempt loop 34350 1726853749.13111: running the handler 34350 1726853749.13128: handler run complete 34350 1726853749.13141: attempt loop complete, returning result 34350 1726853749.13148: _execute() done 34350 1726853749.13155: dumping result to json 34350 1726853749.13205: done dumping result, returning 34350 1726853749.13208: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-b6c1-0de4-00000000015b] 34350 1726853749.13210: sending task result for task 02083763-bbaf-b6c1-0de4-00000000015b 34350 1726853749.13275: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000015b ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 34350 1726853749.13366: no more pending results, returning what we have 34350 1726853749.13369: results queue empty 34350 1726853749.13370: checking for 
any_errors_fatal 34350 1726853749.13378: done checking for any_errors_fatal 34350 1726853749.13379: checking for max_fail_percentage 34350 1726853749.13381: done checking for max_fail_percentage 34350 1726853749.13381: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.13383: done checking to see if all hosts have failed 34350 1726853749.13383: getting the remaining hosts for this loop 34350 1726853749.13385: done getting the remaining hosts for this loop 34350 1726853749.13388: getting the next task for host managed_node1 34350 1726853749.13399: done getting next task for host managed_node1 34350 1726853749.13401: ^ task is: TASK: Fix CentOS6 Base repo 34350 1726853749.13404: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.13407: getting variables 34350 1726853749.13409: in VariableManager get_vars() 34350 1726853749.13438: Calling all_inventory to load vars for managed_node1 34350 1726853749.13441: Calling groups_inventory to load vars for managed_node1 34350 1726853749.13445: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.13455: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.13460: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.13463: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.13728: WORKER PROCESS EXITING 34350 1726853749.14106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.14509: done with get_vars() 34350 1726853749.14518: done getting variables 34350 1726853749.14632: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:35:49 -0400 (0:00:00.044) 0:00:03.019 ****** 34350 1726853749.14662: entering _queue_task() for managed_node1/copy 34350 1726853749.15193: worker is 1 (out of 1 available) 34350 1726853749.15204: exiting _queue_task() for managed_node1/copy 34350 1726853749.15214: done queuing things up, now waiting for results queue to drain 34350 1726853749.15215: waiting for pending results... 
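The "Set flag to indicate system is ostree" task that just returned `ok` above plausibly looks like the following. This is a hypothetical reconstruction from what the log records — the `when:` clause comes verbatim from the evaluated conditional `(not __network_is_ostree is defined)`, and the task references `__ostree_booted_stat` from an earlier `set_fact`; the exact value expression is an assumption, since the real task file is not reproduced in this output:

```yaml
# Hypothetical sketch; actual el_repo_setup.yml contents are not shown in this log.
- name: Set flag to indicate system is ostree
  set_fact:
    # Value expression is an assumption; the run above produced
    # "__network_is_ostree": false for managed_node1.
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"
  when: not __network_is_ostree is defined
```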
34350 1726853749.15662: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 34350 1726853749.15778: in run() - task 02083763-bbaf-b6c1-0de4-00000000015d 34350 1726853749.15806: variable 'ansible_search_path' from source: unknown 34350 1726853749.15815: variable 'ansible_search_path' from source: unknown 34350 1726853749.15853: calling self._execute() 34350 1726853749.16029: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.16032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.16035: variable 'omit' from source: magic vars 34350 1726853749.16415: variable 'ansible_distribution' from source: facts 34350 1726853749.16439: Evaluated conditional (ansible_distribution == 'CentOS'): True 34350 1726853749.16563: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.16577: Evaluated conditional (ansible_distribution_major_version == '6'): False 34350 1726853749.16588: when evaluation is False, skipping this task 34350 1726853749.16595: _execute() done 34350 1726853749.16693: dumping result to json 34350 1726853749.16697: done dumping result, returning 34350 1726853749.16699: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-b6c1-0de4-00000000015d] 34350 1726853749.16702: sending task result for task 02083763-bbaf-b6c1-0de4-00000000015d 34350 1726853749.16768: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000015d 34350 1726853749.16773: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34350 1726853749.16857: no more pending results, returning what we have 34350 1726853749.16863: results queue empty 34350 1726853749.16864: checking for any_errors_fatal 34350 1726853749.16868: done checking for any_errors_fatal 34350 1726853749.16869: checking for 
max_fail_percentage 34350 1726853749.16872: done checking for max_fail_percentage 34350 1726853749.16873: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.16874: done checking to see if all hosts have failed 34350 1726853749.16874: getting the remaining hosts for this loop 34350 1726853749.16877: done getting the remaining hosts for this loop 34350 1726853749.16880: getting the next task for host managed_node1 34350 1726853749.16888: done getting next task for host managed_node1 34350 1726853749.16891: ^ task is: TASK: Include the task 'enable_epel.yml' 34350 1726853749.16894: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.16898: getting variables 34350 1726853749.16900: in VariableManager get_vars() 34350 1726853749.16929: Calling all_inventory to load vars for managed_node1 34350 1726853749.16932: Calling groups_inventory to load vars for managed_node1 34350 1726853749.16935: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.16948: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.16951: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.16953: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.17495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.18287: done with get_vars() 34350 1726853749.18297: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:35:49 -0400 (0:00:00.037) 0:00:03.056 ****** 34350 1726853749.18401: entering _queue_task() for managed_node1/include_tasks 34350 1726853749.18854: worker is 1 (out of 1 available) 34350 1726853749.18874: exiting _queue_task() for managed_node1/include_tasks 34350 1726853749.18886: done queuing things up, now waiting for results queue to drain 34350 1726853749.18887: waiting for pending results... 
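The skipped "Fix CentOS6 Base repo" task and the "enable_epel.yml" include queued above imply conditionals along these lines. Again a hypothetical sketch assembled only from what the log itself evaluates — `(ansible_distribution == 'CentOS')`, `(ansible_distribution_major_version == '6')`, and `(not __network_is_ostree | d(false))` — with the `copy` payload elided, since the real file contents are not in this output:

```yaml
# Hypothetical sketch; the copy destination and content are placeholders.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed path, not from the log
    content: "..."                            # repo definition elided
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'   # False on this host, so skipped

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)        # True here, so the include runs
```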
34350 1726853749.19413: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 34350 1726853749.19446: in run() - task 02083763-bbaf-b6c1-0de4-00000000015e 34350 1726853749.19483: variable 'ansible_search_path' from source: unknown 34350 1726853749.19491: variable 'ansible_search_path' from source: unknown 34350 1726853749.19557: calling self._execute() 34350 1726853749.19696: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.19702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.19705: variable 'omit' from source: magic vars 34350 1726853749.20178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 34350 1726853749.25779: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 34350 1726853749.25811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 34350 1726853749.25947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 34350 1726853749.26222: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 34350 1726853749.26225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 34350 1726853749.26228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 34350 1726853749.26344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 34350 1726853749.26379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 34350 1726853749.26470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 34350 1726853749.26564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 34350 1726853749.26799: variable '__network_is_ostree' from source: set_fact 34350 1726853749.26837: Evaluated conditional (not __network_is_ostree | d(false)): True 34350 1726853749.26884: _execute() done 34350 1726853749.26892: dumping result to json 34350 1726853749.26899: done dumping result, returning 34350 1726853749.27077: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-b6c1-0de4-00000000015e] 34350 1726853749.27081: sending task result for task 02083763-bbaf-b6c1-0de4-00000000015e 34350 1726853749.27151: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000015e 34350 1726853749.27154: WORKER PROCESS EXITING 34350 1726853749.27187: no more pending results, returning what we have 34350 1726853749.27192: in VariableManager get_vars() 34350 1726853749.27230: Calling all_inventory to load vars for managed_node1 34350 1726853749.27234: Calling groups_inventory to load vars for managed_node1 34350 1726853749.27237: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.27249: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.27252: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.27255: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.27767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 34350 1726853749.28321: done with get_vars() 34350 1726853749.28330: variable 'ansible_search_path' from source: unknown 34350 1726853749.28332: variable 'ansible_search_path' from source: unknown 34350 1726853749.28494: we have included files to process 34350 1726853749.28495: generating all_blocks data 34350 1726853749.28497: done generating all_blocks data 34350 1726853749.28503: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34350 1726853749.28505: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34350 1726853749.28508: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 34350 1726853749.30331: done processing included file 34350 1726853749.30334: iterating over new_blocks loaded from include file 34350 1726853749.30335: in VariableManager get_vars() 34350 1726853749.30347: done with get_vars() 34350 1726853749.30349: filtering new block on tags 34350 1726853749.30370: done filtering new block on tags 34350 1726853749.30503: in VariableManager get_vars() 34350 1726853749.30553: done with get_vars() 34350 1726853749.30555: filtering new block on tags 34350 1726853749.30566: done filtering new block on tags 34350 1726853749.30568: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 34350 1726853749.30575: extending task lists for all hosts with included blocks 34350 1726853749.30861: done extending task lists 34350 1726853749.30862: done processing included files 34350 1726853749.30863: results queue empty 34350 1726853749.30864: checking for any_errors_fatal 34350 1726853749.30866: done checking for any_errors_fatal 34350 1726853749.30867: checking for max_fail_percentage 34350 1726853749.30868: done 
checking for max_fail_percentage 34350 1726853749.30869: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.30870: done checking to see if all hosts have failed 34350 1726853749.30870: getting the remaining hosts for this loop 34350 1726853749.30873: done getting the remaining hosts for this loop 34350 1726853749.30876: getting the next task for host managed_node1 34350 1726853749.30880: done getting next task for host managed_node1 34350 1726853749.30882: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 34350 1726853749.30885: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.30887: getting variables 34350 1726853749.30888: in VariableManager get_vars() 34350 1726853749.30896: Calling all_inventory to load vars for managed_node1 34350 1726853749.30898: Calling groups_inventory to load vars for managed_node1 34350 1726853749.30900: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.30905: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.30913: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.30915: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.31211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.31622: done with get_vars() 34350 1726853749.31630: done getting variables 34350 1726853749.31821: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 34350 1726853749.32376: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:35:49 -0400 (0:00:00.140) 0:00:03.196 ****** 34350 1726853749.32421: entering _queue_task() for managed_node1/command 34350 1726853749.32423: Creating lock for command 34350 1726853749.33156: worker is 1 (out of 1 available) 34350 1726853749.33168: exiting _queue_task() for managed_node1/command 34350 1726853749.33182: done queuing things up, now waiting for results queue to drain 34350 1726853749.33183: waiting for pending results... 
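The EPEL tasks that follow ("Create EPEL 10", "Install yum-utils package", "Enable EPEL 7/8") all share the same guard pattern the log evaluates repeatedly: `ansible_distribution in ['RedHat', 'CentOS']` is True, but `ansible_distribution_major_version in ['7', '8']` is False because this run's major version rendered as 10 in the templated task name, so each task is skipped. A hypothetical sketch of one such task, with the actual command replaced by a placeholder since it is not shown in this log:

```yaml
# Hypothetical sketch of a task in enable_epel.yml; only the name template and
# the when: conditions are taken from the log, the command body is a placeholder.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "true"   # real command not recorded in this output
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']   # False on EL10, so skipped
```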
34350 1726853749.33992: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 34350 1726853749.33997: in run() - task 02083763-bbaf-b6c1-0de4-000000000178 34350 1726853749.34211: variable 'ansible_search_path' from source: unknown 34350 1726853749.34214: variable 'ansible_search_path' from source: unknown 34350 1726853749.34216: calling self._execute() 34350 1726853749.34324: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.34385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.34410: variable 'omit' from source: magic vars 34350 1726853749.35209: variable 'ansible_distribution' from source: facts 34350 1726853749.35225: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34350 1726853749.35663: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.35677: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34350 1726853749.35794: when evaluation is False, skipping this task 34350 1726853749.36256: _execute() done 34350 1726853749.36260: dumping result to json 34350 1726853749.36264: done dumping result, returning 34350 1726853749.36267: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-b6c1-0de4-000000000178] 34350 1726853749.36269: sending task result for task 02083763-bbaf-b6c1-0de4-000000000178 34350 1726853749.36348: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000178 34350 1726853749.36351: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34350 1726853749.36423: no more pending results, returning what we have 34350 1726853749.36427: results queue empty 34350 1726853749.36428: checking for any_errors_fatal 34350 1726853749.36429: done checking for any_errors_fatal 34350 1726853749.36430: checking for 
max_fail_percentage 34350 1726853749.36431: done checking for max_fail_percentage 34350 1726853749.36432: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.36433: done checking to see if all hosts have failed 34350 1726853749.36434: getting the remaining hosts for this loop 34350 1726853749.36435: done getting the remaining hosts for this loop 34350 1726853749.36439: getting the next task for host managed_node1 34350 1726853749.36447: done getting next task for host managed_node1 34350 1726853749.36450: ^ task is: TASK: Install yum-utils package 34350 1726853749.36453: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.36458: getting variables 34350 1726853749.36462: in VariableManager get_vars() 34350 1726853749.36497: Calling all_inventory to load vars for managed_node1 34350 1726853749.36500: Calling groups_inventory to load vars for managed_node1 34350 1726853749.36504: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.36516: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.36519: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.36522: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.37364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.37790: done with get_vars() 34350 1726853749.37799: done getting variables 34350 1726853749.38021: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:35:49 -0400 (0:00:00.056) 0:00:03.253 ****** 34350 1726853749.38049: entering _queue_task() for managed_node1/package 34350 1726853749.38051: Creating lock for package 34350 1726853749.38618: worker is 1 (out of 1 available) 34350 1726853749.38635: exiting _queue_task() for managed_node1/package 34350 1726853749.38872: done queuing things up, now waiting for results queue to drain 34350 1726853749.38874: waiting for pending results... 
34350 1726853749.39535: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 34350 1726853749.39540: in run() - task 02083763-bbaf-b6c1-0de4-000000000179 34350 1726853749.39543: variable 'ansible_search_path' from source: unknown 34350 1726853749.39546: variable 'ansible_search_path' from source: unknown 34350 1726853749.39644: calling self._execute() 34350 1726853749.39959: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.40006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.40177: variable 'omit' from source: magic vars 34350 1726853749.40787: variable 'ansible_distribution' from source: facts 34350 1726853749.40935: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34350 1726853749.41399: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.41402: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34350 1726853749.41404: when evaluation is False, skipping this task 34350 1726853749.41406: _execute() done 34350 1726853749.41408: dumping result to json 34350 1726853749.41410: done dumping result, returning 34350 1726853749.41412: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-b6c1-0de4-000000000179] 34350 1726853749.41414: sending task result for task 02083763-bbaf-b6c1-0de4-000000000179 34350 1726853749.41618: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000179 34350 1726853749.41621: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34350 1726853749.41888: no more pending results, returning what we have 34350 1726853749.41892: results queue empty 34350 1726853749.41893: checking for any_errors_fatal 34350 1726853749.41896: done checking for any_errors_fatal 34350 
1726853749.41897: checking for max_fail_percentage 34350 1726853749.41898: done checking for max_fail_percentage 34350 1726853749.41899: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.41899: done checking to see if all hosts have failed 34350 1726853749.41900: getting the remaining hosts for this loop 34350 1726853749.41902: done getting the remaining hosts for this loop 34350 1726853749.41906: getting the next task for host managed_node1 34350 1726853749.41913: done getting next task for host managed_node1 34350 1726853749.41915: ^ task is: TASK: Enable EPEL 7 34350 1726853749.41919: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.41922: getting variables 34350 1726853749.41923: in VariableManager get_vars() 34350 1726853749.41948: Calling all_inventory to load vars for managed_node1 34350 1726853749.41950: Calling groups_inventory to load vars for managed_node1 34350 1726853749.41954: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.41966: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.41969: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.42130: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.42538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.42910: done with get_vars() 34350 1726853749.42920: done getting variables 34350 1726853749.43214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:35:49 -0400 (0:00:00.051) 0:00:03.305 ****** 34350 1726853749.43242: entering _queue_task() for managed_node1/command 34350 1726853749.43801: worker is 1 (out of 1 available) 34350 1726853749.43964: exiting _queue_task() for managed_node1/command 34350 1726853749.43976: done queuing things up, now waiting for results queue to drain 34350 1726853749.43977: waiting for pending results... 
34350 1726853749.44654: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 34350 1726853749.44797: in run() - task 02083763-bbaf-b6c1-0de4-00000000017a 34350 1726853749.44815: variable 'ansible_search_path' from source: unknown 34350 1726853749.44822: variable 'ansible_search_path' from source: unknown 34350 1726853749.44891: calling self._execute() 34350 1726853749.45117: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.45121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.45123: variable 'omit' from source: magic vars 34350 1726853749.45987: variable 'ansible_distribution' from source: facts 34350 1726853749.45995: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34350 1726853749.46117: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.46129: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34350 1726853749.46139: when evaluation is False, skipping this task 34350 1726853749.46148: _execute() done 34350 1726853749.46156: dumping result to json 34350 1726853749.46165: done dumping result, returning 34350 1726853749.46179: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-b6c1-0de4-00000000017a] 34350 1726853749.46208: sending task result for task 02083763-bbaf-b6c1-0de4-00000000017a 34350 1726853749.46367: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000017a 34350 1726853749.46374: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34350 1726853749.46430: no more pending results, returning what we have 34350 1726853749.46433: results queue empty 34350 1726853749.46434: checking for any_errors_fatal 34350 1726853749.46441: done checking for any_errors_fatal 34350 1726853749.46442: checking for 
max_fail_percentage 34350 1726853749.46443: done checking for max_fail_percentage 34350 1726853749.46444: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.46445: done checking to see if all hosts have failed 34350 1726853749.46445: getting the remaining hosts for this loop 34350 1726853749.46447: done getting the remaining hosts for this loop 34350 1726853749.46450: getting the next task for host managed_node1 34350 1726853749.46456: done getting next task for host managed_node1 34350 1726853749.46459: ^ task is: TASK: Enable EPEL 8 34350 1726853749.46462: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.46465: getting variables 34350 1726853749.46466: in VariableManager get_vars() 34350 1726853749.46493: Calling all_inventory to load vars for managed_node1 34350 1726853749.46495: Calling groups_inventory to load vars for managed_node1 34350 1726853749.46497: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.46506: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.46509: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.46511: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.46646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.46796: done with get_vars() 34350 1726853749.46803: done getting variables 34350 1726853749.46843: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:35:49 -0400 (0:00:00.036) 0:00:03.341 ****** 34350 1726853749.46866: entering _queue_task() for managed_node1/command 34350 1726853749.47058: worker is 1 (out of 1 available) 34350 1726853749.47070: exiting _queue_task() for managed_node1/command 34350 1726853749.47082: done queuing things up, now waiting for results queue to drain 34350 1726853749.47083: waiting for pending results... 
34350 1726853749.47229: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 34350 1726853749.47292: in run() - task 02083763-bbaf-b6c1-0de4-00000000017b 34350 1726853749.47302: variable 'ansible_search_path' from source: unknown 34350 1726853749.47307: variable 'ansible_search_path' from source: unknown 34350 1726853749.47334: calling self._execute() 34350 1726853749.47392: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.47396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.47404: variable 'omit' from source: magic vars 34350 1726853749.47673: variable 'ansible_distribution' from source: facts 34350 1726853749.47683: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34350 1726853749.47768: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.47773: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 34350 1726853749.47776: when evaluation is False, skipping this task 34350 1726853749.47779: _execute() done 34350 1726853749.47782: dumping result to json 34350 1726853749.47786: done dumping result, returning 34350 1726853749.47793: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-b6c1-0de4-00000000017b] 34350 1726853749.47797: sending task result for task 02083763-bbaf-b6c1-0de4-00000000017b 34350 1726853749.47909: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000017b 34350 1726853749.47912: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 34350 1726853749.48040: no more pending results, returning what we have 34350 1726853749.48043: results queue empty 34350 1726853749.48044: checking for any_errors_fatal 34350 1726853749.48048: done checking for any_errors_fatal 34350 1726853749.48049: checking for 
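The skip result above comes from Ansible evaluating the task's `when` conditions one at a time against gathered facts: the first condition (`ansible_distribution in ['RedHat', 'CentOS']`) evaluates True, the second (`ansible_distribution_major_version in ['7', '8']`) evaluates False, and that condition is reported back as `false_condition`. A minimal sketch of this short-circuit behavior, with hypothetical helper names (not Ansible's internal API) and facts assumed from this run:

```python
# Hypothetical sketch of per-condition `when` evaluation for the
# "Enable EPEL 8" task above. Conditions are checked in order; the
# first one that evaluates False skips the task and is echoed back
# as the "false_condition" field of the skip result.
def evaluate_when(conditions, facts):
    """Return (task_runs, false_condition) for an ordered condition list."""
    for label, test in conditions:
        if not test(facts):
            return False, label
    return True, None

# Facts assumed from this run (a distribution newer than EL8).
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "9",
}

conditions = [
    ("ansible_distribution in ['RedHat', 'CentOS']",
     lambda f: f["ansible_distribution"] in ["RedHat", "CentOS"]),
    ("ansible_distribution_major_version in ['7', '8']",
     lambda f: f["ansible_distribution_major_version"] in ["7", "8"]),
]

runs, false_condition = evaluate_when(conditions, facts)
# runs is False; false_condition matches the "false_condition" string
# in the skip result logged above.
```

The later "Enable EPEL 6", "Gathering Facts", and "INIT: wireless tests" skips in this log follow the same pattern with different version conditions.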
max_fail_percentage 34350 1726853749.48050: done checking for max_fail_percentage 34350 1726853749.48051: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.48052: done checking to see if all hosts have failed 34350 1726853749.48052: getting the remaining hosts for this loop 34350 1726853749.48053: done getting the remaining hosts for this loop 34350 1726853749.48057: getting the next task for host managed_node1 34350 1726853749.48067: done getting next task for host managed_node1 34350 1726853749.48070: ^ task is: TASK: Enable EPEL 6 34350 1726853749.48075: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.48079: getting variables 34350 1726853749.48080: in VariableManager get_vars() 34350 1726853749.48102: Calling all_inventory to load vars for managed_node1 34350 1726853749.48105: Calling groups_inventory to load vars for managed_node1 34350 1726853749.48107: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.48116: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.48118: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.48121: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.48286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.48489: done with get_vars() 34350 1726853749.48498: done getting variables 34350 1726853749.48552: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:35:49 -0400 (0:00:00.017) 0:00:03.358 ****** 34350 1726853749.48584: entering _queue_task() for managed_node1/copy 34350 1726853749.48805: worker is 1 (out of 1 available) 34350 1726853749.48817: exiting _queue_task() for managed_node1/copy 34350 1726853749.48828: done queuing things up, now waiting for results queue to drain 34350 1726853749.48829: waiting for pending results... 
34350 1726853749.49176: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 34350 1726853749.49181: in run() - task 02083763-bbaf-b6c1-0de4-00000000017d 34350 1726853749.49191: variable 'ansible_search_path' from source: unknown 34350 1726853749.49198: variable 'ansible_search_path' from source: unknown 34350 1726853749.49235: calling self._execute() 34350 1726853749.49315: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.49327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.49340: variable 'omit' from source: magic vars 34350 1726853749.49736: variable 'ansible_distribution' from source: facts 34350 1726853749.49762: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 34350 1726853749.49861: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.49918: Evaluated conditional (ansible_distribution_major_version == '6'): False 34350 1726853749.49922: when evaluation is False, skipping this task 34350 1726853749.49924: _execute() done 34350 1726853749.49927: dumping result to json 34350 1726853749.49929: done dumping result, returning 34350 1726853749.49931: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-b6c1-0de4-00000000017d] 34350 1726853749.49933: sending task result for task 02083763-bbaf-b6c1-0de4-00000000017d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 34350 1726853749.50207: no more pending results, returning what we have 34350 1726853749.50210: results queue empty 34350 1726853749.50211: checking for any_errors_fatal 34350 1726853749.50215: done checking for any_errors_fatal 34350 1726853749.50216: checking for max_fail_percentage 34350 1726853749.50217: done checking for max_fail_percentage 34350 1726853749.50218: checking to see if all hosts have failed and the running 
result is not ok 34350 1726853749.50219: done checking to see if all hosts have failed 34350 1726853749.50219: getting the remaining hosts for this loop 34350 1726853749.50221: done getting the remaining hosts for this loop 34350 1726853749.50224: getting the next task for host managed_node1 34350 1726853749.50231: done getting next task for host managed_node1 34350 1726853749.50233: ^ task is: TASK: Set network provider to 'nm' 34350 1726853749.50236: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853749.50239: getting variables 34350 1726853749.50241: in VariableManager get_vars() 34350 1726853749.50268: Calling all_inventory to load vars for managed_node1 34350 1726853749.50273: Calling groups_inventory to load vars for managed_node1 34350 1726853749.50276: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.50285: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.50288: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.50291: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.50535: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000017d 34350 1726853749.50539: WORKER PROCESS EXITING 34350 1726853749.50565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.50751: done with get_vars() 34350 1726853749.50761: done getting variables 34350 1726853749.50802: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:13 Friday 20 September 2024 13:35:49 -0400 (0:00:00.022) 0:00:03.380 ****** 34350 1726853749.50820: entering _queue_task() for managed_node1/set_fact 34350 1726853749.51000: worker is 1 (out of 1 available) 34350 1726853749.51012: exiting _queue_task() for managed_node1/set_fact 34350 1726853749.51022: done queuing things up, now waiting for results queue to drain 34350 1726853749.51023: waiting for pending results... 34350 1726853749.51184: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 34350 1726853749.51236: in run() - task 02083763-bbaf-b6c1-0de4-000000000007 34350 1726853749.51248: variable 'ansible_search_path' from source: unknown 34350 1726853749.51277: calling self._execute() 34350 1726853749.51337: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.51340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.51349: variable 'omit' from source: magic vars 34350 1726853749.51436: variable 'omit' from source: magic vars 34350 1726853749.51460: variable 'omit' from source: magic vars 34350 1726853749.51486: variable 'omit' from source: magic vars 34350 1726853749.51520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 34350 1726853749.51546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 34350 1726853749.51564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 34350 1726853749.51579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853749.51589: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 34350 1726853749.51611: variable 'inventory_hostname' from source: host vars for 'managed_node1' 34350 1726853749.51615: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.51618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.51690: Set connection var ansible_timeout to 10 34350 1726853749.51696: Set connection var ansible_module_compression to ZIP_DEFLATED 34350 1726853749.51702: Set connection var ansible_pipelining to False 34350 1726853749.51707: Set connection var ansible_shell_executable to /bin/sh 34350 1726853749.51715: Set connection var ansible_connection to ssh 34350 1726853749.51717: Set connection var ansible_shell_type to sh 34350 1726853749.51736: variable 'ansible_shell_executable' from source: unknown 34350 1726853749.51739: variable 'ansible_connection' from source: unknown 34350 1726853749.51741: variable 'ansible_module_compression' from source: unknown 34350 1726853749.51744: variable 'ansible_shell_type' from source: unknown 34350 1726853749.51746: variable 'ansible_shell_executable' from source: unknown 34350 1726853749.51749: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.51751: variable 'ansible_pipelining' from source: unknown 34350 1726853749.51753: variable 'ansible_timeout' from source: unknown 34350 1726853749.51755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.51862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 34350 1726853749.51869: variable 'omit' from source: magic vars 34350 1726853749.51876: starting 
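The "Set connection var ..." entries above show connection settings being resolved per task: several appear alongside "variable ... from source: unknown", meaning no inventory or play variable supplied them, so the logged value is the effective default. A rough sketch of that precedence (host/task variable wins, otherwise a default), using only the values visible in this log rather than Ansible's full configuration chain:

```python
# Sketch (not Ansible's real resolution code) of how the connection
# vars logged above get their values: an explicit host or task variable
# takes precedence; otherwise the effective default is used. The
# defaults below are just the ones visible in this log excerpt.
DEFAULTS = {
    "ansible_timeout": 10,
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_pipelining": False,
}

def resolve_connection_var(name, host_vars):
    """Return the host-supplied value if present, else the default."""
    return host_vars.get(name, DEFAULTS[name])

# In this run only ansible_host / ansible_ssh_extra_args came from host
# vars, so e.g. ansible_timeout resolves to its default of 10.
host_vars = {"ansible_connection": "ssh"}
timeout = resolve_connection_var("ansible_timeout", host_vars)
```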
attempt loop 34350 1726853749.51879: running the handler 34350 1726853749.51888: handler run complete 34350 1726853749.51898: attempt loop complete, returning result 34350 1726853749.51901: _execute() done 34350 1726853749.51903: dumping result to json 34350 1726853749.51906: done dumping result, returning 34350 1726853749.51912: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-b6c1-0de4-000000000007] 34350 1726853749.51916: sending task result for task 02083763-bbaf-b6c1-0de4-000000000007 34350 1726853749.52005: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000007 34350 1726853749.52008: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 34350 1726853749.52073: no more pending results, returning what we have 34350 1726853749.52075: results queue empty 34350 1726853749.52076: checking for any_errors_fatal 34350 1726853749.52081: done checking for any_errors_fatal 34350 1726853749.52081: checking for max_fail_percentage 34350 1726853749.52083: done checking for max_fail_percentage 34350 1726853749.52083: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.52084: done checking to see if all hosts have failed 34350 1726853749.52085: getting the remaining hosts for this loop 34350 1726853749.52086: done getting the remaining hosts for this loop 34350 1726853749.52089: getting the next task for host managed_node1 34350 1726853749.52096: done getting next task for host managed_node1 34350 1726853749.52097: ^ task is: TASK: meta (flush_handlers) 34350 1726853749.52099: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
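The `ok:` result above shows why `set_fact` completes without contacting the remote host: the action runs on the controller and simply returns an `ansible_facts` dict, which the VariableManager merges into the host's variables so later tasks can reference `network_provider`. A minimal sketch of that merge (plain dicts, not Ansible's internal fact cache):

```python
# Sketch of the effect of the set_fact result logged above: the
# returned "ansible_facts" mapping is merged into the host's existing
# facts, making "network_provider" visible to subsequent tasks.
host_facts = {"ansible_distribution": "CentOS"}  # assumed prior facts

task_result = {
    "ansible_facts": {"network_provider": "nm"},
    "changed": False,
}

host_facts.update(task_result["ansible_facts"])
# host_facts now carries network_provider == "nm" alongside the
# previously gathered facts.
```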
False 34350 1726853749.52124: getting variables 34350 1726853749.52126: in VariableManager get_vars() 34350 1726853749.52148: Calling all_inventory to load vars for managed_node1 34350 1726853749.52151: Calling groups_inventory to load vars for managed_node1 34350 1726853749.52153: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.52161: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.52163: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.52166: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.52344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.52553: done with get_vars() 34350 1726853749.52563: done getting variables 34350 1726853749.52641: in VariableManager get_vars() 34350 1726853749.52652: Calling all_inventory to load vars for managed_node1 34350 1726853749.52654: Calling groups_inventory to load vars for managed_node1 34350 1726853749.52656: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.52663: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.52665: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.52673: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.52828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.53063: done with get_vars() 34350 1726853749.53079: done queuing things up, now waiting for results queue to drain 34350 1726853749.53081: results queue empty 34350 1726853749.53082: checking for any_errors_fatal 34350 1726853749.53084: done checking for any_errors_fatal 34350 1726853749.53084: checking for max_fail_percentage 34350 1726853749.53085: done checking for max_fail_percentage 34350 1726853749.53086: checking to see if all hosts have failed and the running result is not 
ok 34350 1726853749.53087: done checking to see if all hosts have failed 34350 1726853749.53088: getting the remaining hosts for this loop 34350 1726853749.53089: done getting the remaining hosts for this loop 34350 1726853749.53091: getting the next task for host managed_node1 34350 1726853749.53106: done getting next task for host managed_node1 34350 1726853749.53108: ^ task is: TASK: meta (flush_handlers) 34350 1726853749.53110: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853749.53117: getting variables 34350 1726853749.53118: in VariableManager get_vars() 34350 1726853749.53125: Calling all_inventory to load vars for managed_node1 34350 1726853749.53127: Calling groups_inventory to load vars for managed_node1 34350 1726853749.53129: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.53133: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.53135: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.53138: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.53350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.53741: done with get_vars() 34350 1726853749.53763: done getting variables 34350 1726853749.53809: in VariableManager get_vars() 34350 1726853749.53817: Calling all_inventory to load vars for managed_node1 34350 1726853749.53819: Calling groups_inventory to load vars for managed_node1 34350 1726853749.53821: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.53825: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.53827: Calling groups_plugins_inventory to load vars for 
managed_node1 34350 1726853749.53830: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.53990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.54211: done with get_vars() 34350 1726853749.54222: done queuing things up, now waiting for results queue to drain 34350 1726853749.54223: results queue empty 34350 1726853749.54224: checking for any_errors_fatal 34350 1726853749.54225: done checking for any_errors_fatal 34350 1726853749.54226: checking for max_fail_percentage 34350 1726853749.54227: done checking for max_fail_percentage 34350 1726853749.54227: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.54228: done checking to see if all hosts have failed 34350 1726853749.54229: getting the remaining hosts for this loop 34350 1726853749.54230: done getting the remaining hosts for this loop 34350 1726853749.54232: getting the next task for host managed_node1 34350 1726853749.54236: done getting next task for host managed_node1 34350 1726853749.54237: ^ task is: None 34350 1726853749.54238: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.54239: done queuing things up, now waiting for results queue to drain 34350 1726853749.54240: results queue empty 34350 1726853749.54241: checking for any_errors_fatal 34350 1726853749.54242: done checking for any_errors_fatal 34350 1726853749.54242: checking for max_fail_percentage 34350 1726853749.54243: done checking for max_fail_percentage 34350 1726853749.54244: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.54244: done checking to see if all hosts have failed 34350 1726853749.54246: getting the next task for host managed_node1 34350 1726853749.54248: done getting next task for host managed_node1 34350 1726853749.54249: ^ task is: None 34350 1726853749.54251: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.54300: in VariableManager get_vars() 34350 1726853749.54331: done with get_vars() 34350 1726853749.54338: in VariableManager get_vars() 34350 1726853749.54359: done with get_vars() 34350 1726853749.54363: variable 'omit' from source: magic vars 34350 1726853749.54385: in VariableManager get_vars() 34350 1726853749.54397: done with get_vars() 34350 1726853749.54415: variable 'omit' from source: magic vars PLAY [Play for testing wireless connection] ************************************ 34350 1726853749.54873: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 34350 1726853749.54894: getting the remaining hosts for this loop 34350 1726853749.54895: done getting the remaining hosts for this loop 34350 1726853749.54897: getting the next task for host managed_node1 34350 1726853749.54898: done getting next task for host managed_node1 34350 1726853749.54900: ^ task is: TASK: Gathering Facts 34350 1726853749.54901: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.54902: getting variables 34350 1726853749.54902: in VariableManager get_vars() 34350 1726853749.54912: Calling all_inventory to load vars for managed_node1 34350 1726853749.54914: Calling groups_inventory to load vars for managed_node1 34350 1726853749.54915: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.54918: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.54926: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.54928: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.55011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.55118: done with get_vars() 34350 1726853749.55123: done getting variables 34350 1726853749.55148: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:3 Friday 20 September 2024 13:35:49 -0400 (0:00:00.043) 0:00:03.424 ****** 34350 1726853749.55166: entering _queue_task() for managed_node1/gather_facts 34350 1726853749.55366: worker is 1 (out of 1 available) 34350 1726853749.55379: exiting _queue_task() for managed_node1/gather_facts 34350 1726853749.55389: done queuing things up, now waiting for results queue to drain 34350 1726853749.55390: waiting for pending results... 
34350 1726853749.55534: running TaskExecutor() for managed_node1/TASK: Gathering Facts 34350 1726853749.55599: in run() - task 02083763-bbaf-b6c1-0de4-0000000001a3 34350 1726853749.55612: variable 'ansible_search_path' from source: unknown 34350 1726853749.55642: calling self._execute() 34350 1726853749.55703: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.55707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.55715: variable 'omit' from source: magic vars 34350 1726853749.55984: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.55993: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.56102: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.56106: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.56108: when evaluation is False, skipping this task 34350 1726853749.56112: _execute() done 34350 1726853749.56115: dumping result to json 34350 1726853749.56119: done dumping result, returning 34350 1726853749.56131: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-b6c1-0de4-0000000001a3] 34350 1726853749.56134: sending task result for task 02083763-bbaf-b6c1-0de4-0000000001a3 34350 1726853749.56230: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000001a3 34350 1726853749.56234: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853749.56290: no more pending results, returning what we have 34350 1726853749.56298: results queue empty 34350 1726853749.56299: checking for any_errors_fatal 34350 1726853749.56301: done checking for any_errors_fatal 34350 1726853749.56301: checking for max_fail_percentage 34350 1726853749.56303: done checking for max_fail_percentage 
34350 1726853749.56304: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.56304: done checking to see if all hosts have failed 34350 1726853749.56305: getting the remaining hosts for this loop 34350 1726853749.56307: done getting the remaining hosts for this loop 34350 1726853749.56310: getting the next task for host managed_node1 34350 1726853749.56317: done getting next task for host managed_node1 34350 1726853749.56319: ^ task is: TASK: meta (flush_handlers) 34350 1726853749.56321: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853749.56324: getting variables 34350 1726853749.56326: in VariableManager get_vars() 34350 1726853749.56374: Calling all_inventory to load vars for managed_node1 34350 1726853749.56377: Calling groups_inventory to load vars for managed_node1 34350 1726853749.56379: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.56389: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.56392: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.56394: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.56579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.56767: done with get_vars() 34350 1726853749.56778: done getting variables 34350 1726853749.56833: in VariableManager get_vars() 34350 1726853749.56852: Calling all_inventory to load vars for managed_node1 34350 1726853749.56856: Calling groups_inventory to load vars for managed_node1 34350 1726853749.56859: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.56863: Calling all_plugins_play to load vars 
for managed_node1 34350 1726853749.56865: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.56867: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.57005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.57194: done with get_vars() 34350 1726853749.57205: done queuing things up, now waiting for results queue to drain 34350 1726853749.57206: results queue empty 34350 1726853749.57207: checking for any_errors_fatal 34350 1726853749.57209: done checking for any_errors_fatal 34350 1726853749.57210: checking for max_fail_percentage 34350 1726853749.57210: done checking for max_fail_percentage 34350 1726853749.57211: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.57212: done checking to see if all hosts have failed 34350 1726853749.57212: getting the remaining hosts for this loop 34350 1726853749.57213: done getting the remaining hosts for this loop 34350 1726853749.57215: getting the next task for host managed_node1 34350 1726853749.57218: done getting next task for host managed_node1 34350 1726853749.57220: ^ task is: TASK: INIT: wireless tests 34350 1726853749.57222: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.57223: getting variables 34350 1726853749.57224: in VariableManager get_vars() 34350 1726853749.57244: Calling all_inventory to load vars for managed_node1 34350 1726853749.57246: Calling groups_inventory to load vars for managed_node1 34350 1726853749.57248: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.57252: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.57254: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.57256: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.57399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.57599: done with get_vars() 34350 1726853749.57607: done getting variables 34350 1726853749.57677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT: wireless tests] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:8 Friday 20 September 2024 13:35:49 -0400 (0:00:00.025) 0:00:03.449 ****** 34350 1726853749.57694: entering _queue_task() for managed_node1/debug 34350 1726853749.57695: Creating lock for debug 34350 1726853749.57916: worker is 1 (out of 1 available) 34350 1726853749.57929: exiting _queue_task() for managed_node1/debug 34350 1726853749.57940: done queuing things up, now waiting for results queue to drain 34350 1726853749.57941: waiting for pending results... 
34350 1726853749.58289: running TaskExecutor() for managed_node1/TASK: INIT: wireless tests
34350 1726853749.58294: in run() - task 02083763-bbaf-b6c1-0de4-00000000000b
34350 1726853749.58297: variable 'ansible_search_path' from source: unknown
34350 1726853749.58300: calling self._execute()
34350 1726853749.58349: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.58363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.58382: variable 'omit' from source: magic vars
34350 1726853749.58753: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.58775: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.58891: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.58901: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.58908: when evaluation is False, skipping this task
34350 1726853749.58915: _execute() done
34350 1726853749.58922: dumping result to json
34350 1726853749.58933: done dumping result, returning
34350 1726853749.58944: done running TaskExecutor() for managed_node1/TASK: INIT: wireless tests [02083763-bbaf-b6c1-0de4-00000000000b]
34350 1726853749.58953: sending task result for task 02083763-bbaf-b6c1-0de4-00000000000b
34350 1726853749.59138: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000000b
34350 1726853749.59140: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853749.59193: no more pending results, returning what we have
34350 1726853749.59197: results queue empty
34350 1726853749.59198: checking for any_errors_fatal
34350 1726853749.59201: done checking for any_errors_fatal
34350 1726853749.59202: checking for max_fail_percentage
34350 1726853749.59203: done checking for max_fail_percentage
34350 1726853749.59204: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.59204: done checking to see if all hosts have failed
34350 1726853749.59205: getting the remaining hosts for this loop
34350 1726853749.59207: done getting the remaining hosts for this loop
34350 1726853749.59210: getting the next task for host managed_node1
34350 1726853749.59216: done getting next task for host managed_node1
34350 1726853749.59219: ^ task is: TASK: Include the task 'setup_mock_wifi.yml'
34350 1726853749.59221: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.59224: getting variables
34350 1726853749.59225: in VariableManager get_vars()
34350 1726853749.59262: Calling all_inventory to load vars for managed_node1
34350 1726853749.59264: Calling groups_inventory to load vars for managed_node1
34350 1726853749.59266: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.59276: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.59278: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.59280: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.59394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.59507: done with get_vars()
34350 1726853749.59514: done getting variables

TASK [Include the task 'setup_mock_wifi.yml'] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:11
Friday 20 September 2024 13:35:49 -0400 (0:00:00.018) 0:00:03.468 ******
34350 1726853749.59573: entering _queue_task() for managed_node1/include_tasks
34350 1726853749.59748: worker is 1 (out of 1 available)
34350 1726853749.59760: exiting _queue_task() for managed_node1/include_tasks
34350 1726853749.59770: done queuing things up, now waiting for results queue to drain
34350 1726853749.59773: waiting for pending results...
34350 1726853749.59910: running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml'
34350 1726853749.59965: in run() - task 02083763-bbaf-b6c1-0de4-00000000000c
34350 1726853749.59974: variable 'ansible_search_path' from source: unknown
34350 1726853749.60007: calling self._execute()
34350 1726853749.60062: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.60066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.60074: variable 'omit' from source: magic vars
34350 1726853749.60317: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.60327: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.60402: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.60406: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.60408: when evaluation is False, skipping this task
34350 1726853749.60412: _execute() done
34350 1726853749.60414: dumping result to json
34350 1726853749.60419: done dumping result, returning
34350 1726853749.60425: done running TaskExecutor() for managed_node1/TASK: Include the task 'setup_mock_wifi.yml' [02083763-bbaf-b6c1-0de4-00000000000c]
34350 1726853749.60430: sending task result for task 02083763-bbaf-b6c1-0de4-00000000000c
34350 1726853749.60512: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000000c
34350 1726853749.60515: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.60581: no more pending results, returning what we have
34350 1726853749.60584: results queue empty
34350 1726853749.60585: checking for any_errors_fatal
34350 1726853749.60590: done checking for any_errors_fatal
34350 1726853749.60590: checking for max_fail_percentage
34350 1726853749.60592: done checking for max_fail_percentage
34350 1726853749.60592: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.60593: done checking to see if all hosts have failed
34350 1726853749.60594: getting the remaining hosts for this loop
34350 1726853749.60595: done getting the remaining hosts for this loop
34350 1726853749.60598: getting the next task for host managed_node1
34350 1726853749.60602: done getting next task for host managed_node1
34350 1726853749.60604: ^ task is: TASK: Copy client certs
34350 1726853749.60606: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.60609: getting variables
34350 1726853749.60610: in VariableManager get_vars()
34350 1726853749.60639: Calling all_inventory to load vars for managed_node1
34350 1726853749.60641: Calling groups_inventory to load vars for managed_node1
34350 1726853749.60643: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.60650: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.60653: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.60655: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.60786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.60897: done with get_vars()
34350 1726853749.60904: done getting variables
34350 1726853749.60940: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Copy client certs] *******************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:13
Friday 20 September 2024 13:35:49 -0400 (0:00:00.013) 0:00:03.482 ******
34350 1726853749.60961: entering _queue_task() for managed_node1/copy
34350 1726853749.61402: worker is 1 (out of 1 available)
34350 1726853749.61409: exiting _queue_task() for managed_node1/copy
34350 1726853749.61418: done queuing things up, now waiting for results queue to drain
34350 1726853749.61419: waiting for pending results...
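The skipped tasks above all follow the same double gate: `ansible_distribution_major_version != '6'` evaluates True, then `ansible_distribution_major_version == '7'` evaluates False, and the task is skipped with that condition reported as `false_condition`. A minimal sketch of a task gated this way (the task body and message are illustrative assumptions; the actual contents of tests_wireless.yml are not shown in this log):

```yaml
# Hypothetical sketch only; the real task body is not in this log.
- name: "INIT: wireless tests"
  debug:
    msg: Running wireless tests
  when:
    - ansible_distribution_major_version != '6'  # True in this run
    - ansible_distribution_major_version == '7'  # False here, so the task is skipped
```

Conditions in a `when` list are ANDed and evaluated in order, which matches the two "Evaluated conditional" entries the log prints per task.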
34350 1726853749.61544: running TaskExecutor() for managed_node1/TASK: Copy client certs
34350 1726853749.61549: in run() - task 02083763-bbaf-b6c1-0de4-00000000000d
34350 1726853749.61554: variable 'ansible_search_path' from source: unknown
34350 1726853749.61786: Loaded config def from plugin (lookup/items)
34350 1726853749.61799: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py
34350 1726853749.61850: variable 'omit' from source: magic vars
34350 1726853749.61985: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.61998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.62013: variable 'omit' from source: magic vars
34350 1726853749.62316: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.62323: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.62465: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.62472: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.62475: when evaluation is False, skipping this task
34350 1726853749.62479: variable 'item' from source: unknown
34350 1726853749.62491: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.key) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.key",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.62614: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.62618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.62620: variable 'omit' from source: magic vars
34350 1726853749.62696: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.62699: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.62775: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.62778: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.62781: when evaluation is False, skipping this task
34350 1726853749.62799: variable 'item' from source: unknown
34350 1726853749.62841: variable 'item' from source: unknown
skipping: [managed_node1] => (item=client.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "client.pem",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.62912: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.62915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.62919: variable 'omit' from source: magic vars
34350 1726853749.63020: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.63023: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.63100: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.63104: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.63106: when evaluation is False, skipping this task
34350 1726853749.63122: variable 'item' from source: unknown
34350 1726853749.63162: variable 'item' from source: unknown
skipping: [managed_node1] => (item=cacert.pem) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "item": "cacert.pem",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.63235: dumping result to json
34350 1726853749.63238: done dumping result, returning
34350 1726853749.63240: done running TaskExecutor() for managed_node1/TASK: Copy client certs [02083763-bbaf-b6c1-0de4-00000000000d]
34350 1726853749.63241: sending task result for task 02083763-bbaf-b6c1-0de4-00000000000d
34350 1726853749.63275: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000000d
34350 1726853749.63277: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false
}

MSG:

All items skipped
34350 1726853749.63316: no more pending results, returning what we have
34350 1726853749.63319: results queue empty
34350 1726853749.63320: checking for any_errors_fatal
34350 1726853749.63323: done checking for any_errors_fatal
34350 1726853749.63324: checking for max_fail_percentage
34350 1726853749.63325: done checking for max_fail_percentage
34350 1726853749.63326: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.63326: done checking to see if all hosts have failed
34350 1726853749.63327: getting the remaining hosts for this loop
34350 1726853749.63329: done getting the remaining hosts for this loop
34350 1726853749.63332: getting the next task for host managed_node1
34350 1726853749.63339: done getting next task for host managed_node1
34350 1726853749.63341: ^ task is: TASK: TEST: wireless connection with WPA-PSK
34350 1726853749.63343: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.63346: getting variables
34350 1726853749.63348: in VariableManager get_vars()
34350 1726853749.63394: Calling all_inventory to load vars for managed_node1
34350 1726853749.63396: Calling groups_inventory to load vars for managed_node1
34350 1726853749.63399: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.63408: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.63410: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.63413: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.63541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.63678: done with get_vars()
34350 1726853749.63685: done getting variables
34350 1726853749.63726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: wireless connection with WPA-PSK] **********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:24
Friday 20 September 2024 13:35:49 -0400 (0:00:00.027) 0:00:03.510 ******
34350 1726853749.63743: entering _queue_task() for managed_node1/debug
34350 1726853749.63927: worker is 1 (out of 1 available)
34350 1726853749.63939: exiting _queue_task() for managed_node1/debug
34350 1726853749.63950: done queuing things up, now waiting for results queue to drain
34350 1726853749.63951: waiting for pending results...
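The per-item skips for client.key, client.pem, and cacert.pem above show how a looped task behaves under a False `when`: the conditional is re-evaluated for every item, each item emits its own skip result carrying `ansible_loop_var: item`, and the task-level result is "All items skipped". A hedged sketch of a `copy` task that would produce this shape of output (the item names come from the log; the source and destination paths are invented for illustration):

```yaml
# Illustrative only: the real src/dest paths are not shown in this log.
- name: Copy client certs
  copy:
    src: "{{ item }}"
    dest: "/etc/pki/tls/{{ item }}"  # hypothetical destination
  loop:
    - client.key
    - client.pem
    - cacert.pem
  when: ansible_distribution_major_version == '7'  # False on this host: every item is skipped
```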
34350 1726853749.64099: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK
34350 1726853749.64276: in run() - task 02083763-bbaf-b6c1-0de4-00000000000f
34350 1726853749.64281: variable 'ansible_search_path' from source: unknown
34350 1726853749.64283: calling self._execute()
34350 1726853749.64302: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.64313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.64327: variable 'omit' from source: magic vars
34350 1726853749.64674: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.64694: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.64810: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.64823: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.64832: when evaluation is False, skipping this task
34350 1726853749.64840: _execute() done
34350 1726853749.64847: dumping result to json
34350 1726853749.64855: done dumping result, returning
34350 1726853749.64867: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with WPA-PSK [02083763-bbaf-b6c1-0de4-00000000000f]
34350 1726853749.64881: sending task result for task 02083763-bbaf-b6c1-0de4-00000000000f
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853749.65042: no more pending results, returning what we have
34350 1726853749.65045: results queue empty
34350 1726853749.65046: checking for any_errors_fatal
34350 1726853749.65053: done checking for any_errors_fatal
34350 1726853749.65054: checking for max_fail_percentage
34350 1726853749.65055: done checking for max_fail_percentage
34350 1726853749.65056: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.65057: done checking to see if all hosts have failed
34350 1726853749.65057: getting the remaining hosts for this loop
34350 1726853749.65061: done getting the remaining hosts for this loop
34350 1726853749.65065: getting the next task for host managed_node1
34350 1726853749.65075: done getting next task for host managed_node1
34350 1726853749.65080: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853749.65083: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.65099: getting variables
34350 1726853749.65193: in VariableManager get_vars()
34350 1726853749.65235: Calling all_inventory to load vars for managed_node1
34350 1726853749.65238: Calling groups_inventory to load vars for managed_node1
34350 1726853749.65240: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.65246: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000000f
34350 1726853749.65249: WORKER PROCESS EXITING
34350 1726853749.65258: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.65261: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.65264: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.65427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.65625: done with get_vars()
34350 1726853749.65635: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:35:49 -0400 (0:00:00.019) 0:00:03.530 ******
34350 1726853749.65727: entering _queue_task() for managed_node1/include_tasks
34350 1726853749.65951: worker is 1 (out of 1 available)
34350 1726853749.65963: exiting _queue_task() for managed_node1/include_tasks
34350 1726853749.65974: done queuing things up, now waiting for results queue to drain
34350 1726853749.65976: waiting for pending results...
34350 1726853749.66173: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853749.66303: in run() - task 02083763-bbaf-b6c1-0de4-000000000017
34350 1726853749.66323: variable 'ansible_search_path' from source: unknown
34350 1726853749.66331: variable 'ansible_search_path' from source: unknown
34350 1726853749.66369: calling self._execute()
34350 1726853749.66455: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.66466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.66489: variable 'omit' from source: magic vars
34350 1726853749.66851: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.66869: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.66994: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.67006: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.67014: when evaluation is False, skipping this task
34350 1726853749.67023: _execute() done
34350 1726853749.67030: dumping result to json
34350 1726853749.67040: done dumping result, returning
34350 1726853749.67057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-b6c1-0de4-000000000017]
34350 1726853749.67068: sending task result for task 02083763-bbaf-b6c1-0de4-000000000017
34350 1726853749.67277: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000017
34350 1726853749.67281: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.67327: no more pending results, returning what we have
34350 1726853749.67332: results queue empty
34350 1726853749.67333: checking for any_errors_fatal
34350 1726853749.67340: done checking for any_errors_fatal
34350 1726853749.67341: checking for max_fail_percentage
34350 1726853749.67343: done checking for max_fail_percentage
34350 1726853749.67344: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.67345: done checking to see if all hosts have failed
34350 1726853749.67345: getting the remaining hosts for this loop
34350 1726853749.67347: done getting the remaining hosts for this loop
34350 1726853749.67351: getting the next task for host managed_node1
34350 1726853749.67358: done getting next task for host managed_node1
34350 1726853749.67362: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34350 1726853749.67365: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.67383: getting variables
34350 1726853749.67385: in VariableManager get_vars()
34350 1726853749.67438: Calling all_inventory to load vars for managed_node1
34350 1726853749.67441: Calling groups_inventory to load vars for managed_node1
34350 1726853749.67444: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.67455: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.67459: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.67462: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.67955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.68148: done with get_vars()
34350 1726853749.68157: done getting variables
34350 1726853749.68216: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 13:35:49 -0400 (0:00:00.025) 0:00:03.555 ******
34350 1726853749.68248: entering _queue_task() for managed_node1/debug
34350 1726853749.68502: worker is 1 (out of 1 available)
34350 1726853749.68516: exiting _queue_task() for managed_node1/debug
34350 1726853749.68528: done queuing things up, now waiting for results queue to drain
34350 1726853749.68529: waiting for pending results...
34350 1726853749.68788: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
34350 1726853749.68928: in run() - task 02083763-bbaf-b6c1-0de4-000000000018
34350 1726853749.68949: variable 'ansible_search_path' from source: unknown
34350 1726853749.68958: variable 'ansible_search_path' from source: unknown
34350 1726853749.69002: calling self._execute()
34350 1726853749.69093: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.69107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.69126: variable 'omit' from source: magic vars
34350 1726853749.69462: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.69469: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.69548: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.69553: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.69556: when evaluation is False, skipping this task
34350 1726853749.69561: _execute() done
34350 1726853749.69564: dumping result to json
34350 1726853749.69567: done dumping result, returning
34350 1726853749.69574: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-b6c1-0de4-000000000018]
34350 1726853749.69577: sending task result for task 02083763-bbaf-b6c1-0de4-000000000018
34350 1726853749.69667: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000018
34350 1726853749.69670: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853749.69713: no more pending results, returning what we have
34350 1726853749.69717: results queue empty
34350 1726853749.69717: checking for any_errors_fatal
34350 1726853749.69722: done checking for any_errors_fatal
34350 1726853749.69723: checking for max_fail_percentage
34350 1726853749.69724: done checking for max_fail_percentage
34350 1726853749.69725: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.69726: done checking to see if all hosts have failed
34350 1726853749.69727: getting the remaining hosts for this loop
34350 1726853749.69728: done getting the remaining hosts for this loop
34350 1726853749.69731: getting the next task for host managed_node1
34350 1726853749.69746: done getting next task for host managed_node1
34350 1726853749.69750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34350 1726853749.69753: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.69769: getting variables
34350 1726853749.69772: in VariableManager get_vars()
34350 1726853749.69810: Calling all_inventory to load vars for managed_node1
34350 1726853749.69812: Calling groups_inventory to load vars for managed_node1
34350 1726853749.69814: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.69822: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.69824: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.69826: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.69941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.70070: done with get_vars()
34350 1726853749.70080: done getting variables
34350 1726853749.70139: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 13:35:49 -0400 (0:00:00.019) 0:00:03.574 ******
34350 1726853749.70163: entering _queue_task() for managed_node1/fail
34350 1726853749.70164: Creating lock for fail
34350 1726853749.70348: worker is 1 (out of 1 available)
34350 1726853749.70361: exiting _queue_task() for managed_node1/fail
34350 1726853749.70374: done queuing things up, now waiting for results queue to drain
34350 1726853749.70375: waiting for pending results...
34350 1726853749.70529: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34350 1726853749.70618: in run() - task 02083763-bbaf-b6c1-0de4-000000000019
34350 1726853749.70622: variable 'ansible_search_path' from source: unknown
34350 1726853749.70625: variable 'ansible_search_path' from source: unknown
34350 1726853749.70656: calling self._execute()
34350 1726853749.70716: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.70720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.70732: variable 'omit' from source: magic vars
34350 1726853749.70994: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.71003: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.71083: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.71087: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.71090: when evaluation is False, skipping this task
34350 1726853749.71093: _execute() done
34350 1726853749.71095: dumping result to json
34350 1726853749.71102: done dumping result, returning
34350 1726853749.71108: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-b6c1-0de4-000000000019]
34350 1726853749.71111: sending task result for task 02083763-bbaf-b6c1-0de4-000000000019
34350 1726853749.71197: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000019
34350 1726853749.71199: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
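The task above is a guard implemented with the `fail` action module (hence the "Creating lock for fail" and "Loading ActionModule 'fail'" entries): it is meant to abort the role when the `network_state` variable is combined with the initscripts provider. A sketch of the general shape of such a guard, with the condition inferred from the task title rather than from the role's actual source, which is not shown in this log:

```yaml
# Hypothetical guard; variable names are assumptions based on the task title.
- name: Abort if network_state is used with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when:
    - network_state is defined
    - network_provider == "initscripts"
```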
34350 1726853749.71239: no more pending results, returning what we have
34350 1726853749.71242: results queue empty
34350 1726853749.71243: checking for any_errors_fatal
34350 1726853749.71247: done checking for any_errors_fatal
34350 1726853749.71248: checking for max_fail_percentage
34350 1726853749.71249: done checking for max_fail_percentage
34350 1726853749.71250: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.71250: done checking to see if all hosts have failed
34350 1726853749.71251: getting the remaining hosts for this loop
34350 1726853749.71253: done getting the remaining hosts for this loop
34350 1726853749.71255: getting the next task for host managed_node1
34350 1726853749.71261: done getting next task for host managed_node1
34350 1726853749.71265: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34350 1726853749.71267: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.71281: getting variables
34350 1726853749.71282: in VariableManager get_vars()
34350 1726853749.71319: Calling all_inventory to load vars for managed_node1
34350 1726853749.71321: Calling groups_inventory to load vars for managed_node1
34350 1726853749.71323: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.71335: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.71337: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.71339: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.71478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.71594: done with get_vars()
34350 1726853749.71600: done getting variables
34350 1726853749.71638: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.589 ******

34350 1726853749.71658: entering _queue_task() for managed_node1/fail
34350 1726853749.71827: worker is 1 (out of 1 available)
34350 1726853749.71841: exiting _queue_task() for managed_node1/fail
34350 1726853749.71851: done queuing things up, now waiting for results queue to drain
34350 1726853749.71852: waiting for pending results...
34350 1726853749.71996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34350 1726853749.72075: in run() - task 02083763-bbaf-b6c1-0de4-00000000001a
34350 1726853749.72085: variable 'ansible_search_path' from source: unknown
34350 1726853749.72088: variable 'ansible_search_path' from source: unknown
34350 1726853749.72119: calling self._execute()
34350 1726853749.72170: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.72176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.72186: variable 'omit' from source: magic vars
34350 1726853749.72436: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.72448: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.72521: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.72526: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.72529: when evaluation is False, skipping this task
34350 1726853749.72532: _execute() done
34350 1726853749.72534: dumping result to json
34350 1726853749.72538: done dumping result, returning
34350 1726853749.72546: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-b6c1-0de4-00000000001a]
34350 1726853749.72549: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001a
34350 1726853749.72631: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001a
34350 1726853749.72634: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.72679: no more pending results, returning what we have
34350 1726853749.72682: results queue empty
34350 1726853749.72683: checking for any_errors_fatal
34350 1726853749.72689: done checking for any_errors_fatal
34350 1726853749.72689: checking for max_fail_percentage
34350 1726853749.72691: done checking for max_fail_percentage
34350 1726853749.72691: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.72692: done checking to see if all hosts have failed
34350 1726853749.72693: getting the remaining hosts for this loop
34350 1726853749.72694: done getting the remaining hosts for this loop
34350 1726853749.72697: getting the next task for host managed_node1
34350 1726853749.72702: done getting next task for host managed_node1
34350 1726853749.72705: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34350 1726853749.72708: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.72720: getting variables
34350 1726853749.72721: in VariableManager get_vars()
34350 1726853749.72763: Calling all_inventory to load vars for managed_node1
34350 1726853749.72766: Calling groups_inventory to load vars for managed_node1
34350 1726853749.72767: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.72776: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.72777: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.72780: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.72885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.73004: done with get_vars()
34350 1726853749.73011: done getting variables
34350 1726853749.73048: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.603 ******

34350 1726853749.73072: entering _queue_task() for managed_node1/fail
34350 1726853749.73233: worker is 1 (out of 1 available)
34350 1726853749.73247: exiting _queue_task() for managed_node1/fail
34350 1726853749.73256: done queuing things up, now waiting for results queue to drain
34350 1726853749.73260: waiting for pending results...
34350 1726853749.73413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34350 1726853749.73492: in run() - task 02083763-bbaf-b6c1-0de4-00000000001b
34350 1726853749.73499: variable 'ansible_search_path' from source: unknown
34350 1726853749.73502: variable 'ansible_search_path' from source: unknown
34350 1726853749.73532: calling self._execute()
34350 1726853749.73592: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.73596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.73607: variable 'omit' from source: magic vars
34350 1726853749.73866: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.73876: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.73951: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.73955: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.73963: when evaluation is False, skipping this task
34350 1726853749.73966: _execute() done
34350 1726853749.73969: dumping result to json
34350 1726853749.73973: done dumping result, returning
34350 1726853749.73976: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-b6c1-0de4-00000000001b]
34350 1726853749.73979: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001b
34350 1726853749.74060: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001b
34350 1726853749.74063: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.74107: no more pending results, returning what we have
34350 1726853749.74111: results queue empty
34350 1726853749.74111: checking for any_errors_fatal
34350 1726853749.74115: done checking for any_errors_fatal
34350 1726853749.74116: checking for max_fail_percentage
34350 1726853749.74117: done checking for max_fail_percentage
34350 1726853749.74118: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.74119: done checking to see if all hosts have failed
34350 1726853749.74119: getting the remaining hosts for this loop
34350 1726853749.74121: done getting the remaining hosts for this loop
34350 1726853749.74124: getting the next task for host managed_node1
34350 1726853749.74128: done getting next task for host managed_node1
34350 1726853749.74132: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34350 1726853749.74134: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.74146: getting variables
34350 1726853749.74147: in VariableManager get_vars()
34350 1726853749.74192: Calling all_inventory to load vars for managed_node1
34350 1726853749.74194: Calling groups_inventory to load vars for managed_node1
34350 1726853749.74195: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.74201: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.74202: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.74204: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.74338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.74457: done with get_vars()
34350 1726853749.74464: done getting variables
34350 1726853749.74530: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.618 ******

34350 1726853749.74549: entering _queue_task() for managed_node1/dnf
34350 1726853749.74710: worker is 1 (out of 1 available)
34350 1726853749.74720: exiting _queue_task() for managed_node1/dnf
34350 1726853749.74730: done queuing things up, now waiting for results queue to drain
34350 1726853749.74731: waiting for pending results...
34350 1726853749.74883: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34350 1726853749.74959: in run() - task 02083763-bbaf-b6c1-0de4-00000000001c
34350 1726853749.74972: variable 'ansible_search_path' from source: unknown
34350 1726853749.74976: variable 'ansible_search_path' from source: unknown
34350 1726853749.75002: calling self._execute()
34350 1726853749.75062: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.75066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.75082: variable 'omit' from source: magic vars
34350 1726853749.75333: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.75342: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.75420: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.75424: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.75427: when evaluation is False, skipping this task
34350 1726853749.75430: _execute() done
34350 1726853749.75432: dumping result to json
34350 1726853749.75436: done dumping result, returning
34350 1726853749.75444: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-00000000001c]
34350 1726853749.75447: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001c
34350 1726853749.75533: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001c
34350 1726853749.75536: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.75583: no more pending results, returning what we have
34350 1726853749.75587: results queue empty
34350 1726853749.75587: checking for any_errors_fatal
34350 1726853749.75592: done checking for any_errors_fatal
34350 1726853749.75592: checking for max_fail_percentage
34350 1726853749.75594: done checking for max_fail_percentage
34350 1726853749.75595: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.75595: done checking to see if all hosts have failed
34350 1726853749.75596: getting the remaining hosts for this loop
34350 1726853749.75597: done getting the remaining hosts for this loop
34350 1726853749.75600: getting the next task for host managed_node1
34350 1726853749.75604: done getting next task for host managed_node1
34350 1726853749.75607: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34350 1726853749.75609: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.75622: getting variables
34350 1726853749.75623: in VariableManager get_vars()
34350 1726853749.75665: Calling all_inventory to load vars for managed_node1
34350 1726853749.75667: Calling groups_inventory to load vars for managed_node1
34350 1726853749.75669: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.75677: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.75679: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.75681: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.75785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.75909: done with get_vars()
34350 1726853749.75915: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34350 1726853749.75965: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.632 ******

34350 1726853749.75986: entering _queue_task() for managed_node1/yum
34350 1726853749.75987: Creating lock for yum
34350 1726853749.76164: worker is 1 (out of 1 available)
34350 1726853749.76178: exiting _queue_task() for managed_node1/yum
34350 1726853749.76188: done queuing things up, now waiting for results queue to drain
34350 1726853749.76190: waiting for pending results...
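The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line shows ansible-core's plugin routing at work: on ansible-core 2.17 the `yum` action is an alias that resolves to the `dnf` action plugin, so a task written against the yum module is executed by dnf. A hedged sketch of a task that would trigger this redirect (the package name and options are illustrative assumptions, not the role's actual source at tasks/main.yml:48):

```yaml
# Illustrative sketch only -- not the role's actual task source.
# ansible.builtin.yum is routed to the dnf action plugin by ansible-core's
# runtime redirect table, producing the "redirecting (type: action)" log line.
- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:
    name: NetworkManager   # assumed package for illustration
    state: latest
  check_mode: true         # report available updates without installing
```

The redirect is transparent to the playbook: the task result looks the same whether it was written as `yum:` or `dnf:`.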
34350 1726853749.76341: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34350 1726853749.76419: in run() - task 02083763-bbaf-b6c1-0de4-00000000001d
34350 1726853749.76429: variable 'ansible_search_path' from source: unknown
34350 1726853749.76435: variable 'ansible_search_path' from source: unknown
34350 1726853749.76466: calling self._execute()
34350 1726853749.76527: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.76530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.76542: variable 'omit' from source: magic vars
34350 1726853749.76843: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.76853: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.76932: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.76936: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.76939: when evaluation is False, skipping this task
34350 1726853749.76942: _execute() done
34350 1726853749.76945: dumping result to json
34350 1726853749.76947: done dumping result, returning
34350 1726853749.76955: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-00000000001d]
34350 1726853749.76966: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001d
34350 1726853749.77048: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001d
34350 1726853749.77051: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.77121: no more pending results, returning what we have
34350 1726853749.77124: results queue empty
34350 1726853749.77124: checking for any_errors_fatal
34350 1726853749.77128: done checking for any_errors_fatal
34350 1726853749.77128: checking for max_fail_percentage
34350 1726853749.77130: done checking for max_fail_percentage
34350 1726853749.77130: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.77131: done checking to see if all hosts have failed
34350 1726853749.77132: getting the remaining hosts for this loop
34350 1726853749.77133: done getting the remaining hosts for this loop
34350 1726853749.77135: getting the next task for host managed_node1
34350 1726853749.77140: done getting next task for host managed_node1
34350 1726853749.77143: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34350 1726853749.77145: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.77157: getting variables
34350 1726853749.77161: in VariableManager get_vars()
34350 1726853749.77195: Calling all_inventory to load vars for managed_node1
34350 1726853749.77197: Calling groups_inventory to load vars for managed_node1
34350 1726853749.77200: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.77208: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.77209: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.77211: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.77344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.77465: done with get_vars()
34350 1726853749.77473: done getting variables
34350 1726853749.77511: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 13:35:49 -0400 (0:00:00.015) 0:00:03.648 ******

34350 1726853749.77533: entering _queue_task() for managed_node1/fail
34350 1726853749.77707: worker is 1 (out of 1 available)
34350 1726853749.77719: exiting _queue_task() for managed_node1/fail
34350 1726853749.77730: done queuing things up, now waiting for results queue to drain
34350 1726853749.77731: waiting for pending results...
34350 1726853749.77877: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
34350 1726853749.77946: in run() - task 02083763-bbaf-b6c1-0de4-00000000001e
34350 1726853749.77962: variable 'ansible_search_path' from source: unknown
34350 1726853749.77966: variable 'ansible_search_path' from source: unknown
34350 1726853749.77991: calling self._execute()
34350 1726853749.78044: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.78047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.78056: variable 'omit' from source: magic vars
34350 1726853749.78303: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.78312: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.78389: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.78392: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.78396: when evaluation is False, skipping this task
34350 1726853749.78399: _execute() done
34350 1726853749.78403: dumping result to json
34350 1726853749.78406: done dumping result, returning
34350 1726853749.78409: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-00000000001e]
34350 1726853749.78422: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001e
34350 1726853749.78497: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001e
34350 1726853749.78500: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.78556: no more pending results, returning what we have
34350 1726853749.78560: results queue empty
34350 1726853749.78561: checking for any_errors_fatal
34350 1726853749.78565: done checking for any_errors_fatal
34350 1726853749.78566: checking for max_fail_percentage
34350 1726853749.78567: done checking for max_fail_percentage
34350 1726853749.78568: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.78568: done checking to see if all hosts have failed
34350 1726853749.78569: getting the remaining hosts for this loop
34350 1726853749.78570: done getting the remaining hosts for this loop
34350 1726853749.78575: getting the next task for host managed_node1
34350 1726853749.78579: done getting next task for host managed_node1
34350 1726853749.78583: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
34350 1726853749.78585: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.78596: getting variables
34350 1726853749.78598: in VariableManager get_vars()
34350 1726853749.78625: Calling all_inventory to load vars for managed_node1
34350 1726853749.78627: Calling groups_inventory to load vars for managed_node1
34350 1726853749.78628: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.78633: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.78636: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.78638: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.78740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.78879: done with get_vars()
34350 1726853749.78885: done getting variables
34350 1726853749.78922: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.662 ******

34350 1726853749.78944: entering _queue_task() for managed_node1/package
34350 1726853749.79113: worker is 1 (out of 1 available)
34350 1726853749.79127: exiting _queue_task() for managed_node1/package
34350 1726853749.79136: done queuing things up, now waiting for results queue to drain
34350 1726853749.79137: waiting for pending results...
34350 1726853749.79275: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
34350 1726853749.79354: in run() - task 02083763-bbaf-b6c1-0de4-00000000001f
34350 1726853749.79374: variable 'ansible_search_path' from source: unknown
34350 1726853749.79378: variable 'ansible_search_path' from source: unknown
34350 1726853749.79404: calling self._execute()
34350 1726853749.79459: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.79476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.79481: variable 'omit' from source: magic vars
34350 1726853749.79730: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.79740: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.79818: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.79822: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.79824: when evaluation is False, skipping this task
34350 1726853749.79829: _execute() done
34350 1726853749.79832: dumping result to json
34350 1726853749.79834: done dumping result, returning
34350 1726853749.79842: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-b6c1-0de4-00000000001f]
34350 1726853749.79847: sending task result for task 02083763-bbaf-b6c1-0de4-00000000001f
34350 1726853749.79933: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000001f
34350 1726853749.79936: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.79983: no more pending results, returning what we have
34350 1726853749.79986: results queue empty
34350 1726853749.79987: checking for any_errors_fatal
34350 1726853749.79992: done checking for any_errors_fatal
34350 1726853749.79993: checking for max_fail_percentage
34350 1726853749.79994: done checking for max_fail_percentage
34350 1726853749.79995: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.79996: done checking to see if all hosts have failed
34350 1726853749.79997: getting the remaining hosts for this loop
34350 1726853749.79998: done getting the remaining hosts for this loop
34350 1726853749.80000: getting the next task for host managed_node1
34350 1726853749.80005: done getting next task for host managed_node1
34350 1726853749.80008: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34350 1726853749.80010: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.80022: getting variables
34350 1726853749.80023: in VariableManager get_vars()
34350 1726853749.80060: Calling all_inventory to load vars for managed_node1
34350 1726853749.80063: Calling groups_inventory to load vars for managed_node1
34350 1726853749.80065: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.80073: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.80075: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.80077: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.80180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.80299: done with get_vars()
34350 1726853749.80305: done getting variables
34350 1726853749.80341: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.676 ******

34350 1726853749.80361: entering _queue_task() for managed_node1/package
34350 1726853749.80528: worker is 1 (out of 1 available)
34350 1726853749.80540: exiting _queue_task() for managed_node1/package
34350 1726853749.80549: done queuing things up, now waiting for results queue to drain
34350 1726853749.80550: waiting for pending results...
34350 1726853749.80688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
34350 1726853749.80754: in run() - task 02083763-bbaf-b6c1-0de4-000000000020
34350 1726853749.80768: variable 'ansible_search_path' from source: unknown
34350 1726853749.80775: variable 'ansible_search_path' from source: unknown
34350 1726853749.80801: calling self._execute()
34350 1726853749.80855: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.80858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.80869: variable 'omit' from source: magic vars
34350 1726853749.81118: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.81122: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.81197: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.81200: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.81205: when evaluation is False, skipping this task
34350 1726853749.81207: _execute() done
34350 1726853749.81210: dumping result to json
34350 1726853749.81214: done dumping result, returning
34350 1726853749.81229: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-b6c1-0de4-000000000020]
34350 1726853749.81233: sending task result for task 02083763-bbaf-b6c1-0de4-000000000020
34350 1726853749.81309: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000020
34350 1726853749.81312: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.81366: no more pending results, returning what we have
34350 1726853749.81369: results queue empty
34350 1726853749.81370: checking for any_errors_fatal
34350 1726853749.81384: done checking for any_errors_fatal
34350 1726853749.81385: checking for max_fail_percentage
34350 1726853749.81387: done checking for max_fail_percentage
34350 1726853749.81388: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.81388: done checking to see if all hosts have failed
34350 1726853749.81389: getting the remaining hosts for this loop
34350 1726853749.81390: done getting the remaining hosts for this loop
34350 1726853749.81393: getting the next task for host managed_node1
34350 1726853749.81398: done getting next task for host managed_node1
34350 1726853749.81401: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34350 1726853749.81403: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.81415: getting variables
34350 1726853749.81416: in VariableManager get_vars()
34350 1726853749.81445: Calling all_inventory to load vars for managed_node1
34350 1726853749.81447: Calling groups_inventory to load vars for managed_node1
34350 1726853749.81448: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.81454: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.81456: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.81457: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.81589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.81709: done with get_vars()
34350 1726853749.81715: done getting variables
34350 1726853749.81751: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.690 ******
34350 1726853749.81774: entering _queue_task() for managed_node1/package
34350 1726853749.81933: worker is 1 (out of 1 available)
34350 1726853749.81946: exiting _queue_task() for managed_node1/package
34350 1726853749.81955: done queuing things up, now waiting for results queue to drain
34350 1726853749.81957: waiting for pending results...
34350 1726853749.82108: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
34350 1726853749.82183: in run() - task 02083763-bbaf-b6c1-0de4-000000000021
34350 1726853749.82198: variable 'ansible_search_path' from source: unknown
34350 1726853749.82202: variable 'ansible_search_path' from source: unknown
34350 1726853749.82226: calling self._execute()
34350 1726853749.82286: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.82289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.82308: variable 'omit' from source: magic vars
34350 1726853749.82549: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.82557: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.82631: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.82642: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.82645: when evaluation is False, skipping this task
34350 1726853749.82647: _execute() done
34350 1726853749.82650: dumping result to json
34350 1726853749.82652: done dumping result, returning
34350 1726853749.82660: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-b6c1-0de4-000000000021]
34350 1726853749.82666: sending task result for task 02083763-bbaf-b6c1-0de4-000000000021
34350 1726853749.82748: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000021
34350 1726853749.82751: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.82795: no more pending results, returning what we have
34350 1726853749.82799: results queue empty
34350 1726853749.82800: checking for any_errors_fatal
34350 1726853749.82806: done checking for any_errors_fatal
34350 1726853749.82806: checking for max_fail_percentage
34350 1726853749.82808: done checking for max_fail_percentage
34350 1726853749.82808: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.82809: done checking to see if all hosts have failed
34350 1726853749.82810: getting the remaining hosts for this loop
34350 1726853749.82811: done getting the remaining hosts for this loop
34350 1726853749.82814: getting the next task for host managed_node1
34350 1726853749.82820: done getting next task for host managed_node1
34350 1726853749.82823: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34350 1726853749.82825: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.82837: getting variables
34350 1726853749.82839: in VariableManager get_vars()
34350 1726853749.82876: Calling all_inventory to load vars for managed_node1
34350 1726853749.82879: Calling groups_inventory to load vars for managed_node1
34350 1726853749.82881: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.82888: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.82890: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.82892: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.82994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.83112: done with get_vars()
34350 1726853749.83118: done getting variables
34350 1726853749.83184: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.704 ******
34350 1726853749.83206: entering _queue_task() for managed_node1/service
34350 1726853749.83207: Creating lock for service
34350 1726853749.83379: worker is 1 (out of 1 available)
34350 1726853749.83393: exiting _queue_task() for managed_node1/service
34350 1726853749.83402: done queuing things up, now waiting for results queue to drain
34350 1726853749.83404: waiting for pending results...
34350 1726853749.83562: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
34350 1726853749.83642: in run() - task 02083763-bbaf-b6c1-0de4-000000000022
34350 1726853749.83651: variable 'ansible_search_path' from source: unknown
34350 1726853749.83654: variable 'ansible_search_path' from source: unknown
34350 1726853749.83684: calling self._execute()
34350 1726853749.83748: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.83753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.83765: variable 'omit' from source: magic vars
34350 1726853749.84014: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.84023: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.84100: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.84104: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.84107: when evaluation is False, skipping this task
34350 1726853749.84110: _execute() done
34350 1726853749.84112: dumping result to json
34350 1726853749.84117: done dumping result, returning
34350 1726853749.84124: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000022]
34350 1726853749.84129: sending task result for task 02083763-bbaf-b6c1-0de4-000000000022
34350 1726853749.84212: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000022
34350 1726853749.84215: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.84256: no more pending results, returning what we have
34350 1726853749.84261: results queue empty
34350 1726853749.84262: checking for any_errors_fatal
34350 1726853749.84266: done checking for any_errors_fatal
34350 1726853749.84267: checking for max_fail_percentage
34350 1726853749.84268: done checking for max_fail_percentage
34350 1726853749.84269: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.84270: done checking to see if all hosts have failed
34350 1726853749.84273: getting the remaining hosts for this loop
34350 1726853749.84274: done getting the remaining hosts for this loop
34350 1726853749.84277: getting the next task for host managed_node1
34350 1726853749.84282: done getting next task for host managed_node1
34350 1726853749.84285: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34350 1726853749.84287: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.84300: getting variables
34350 1726853749.84301: in VariableManager get_vars()
34350 1726853749.84336: Calling all_inventory to load vars for managed_node1
34350 1726853749.84339: Calling groups_inventory to load vars for managed_node1
34350 1726853749.84341: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.84348: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.84350: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.84351: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.84487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.84604: done with get_vars()
34350 1726853749.84609: done getting variables
34350 1726853749.84646: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.719 ******
34350 1726853749.84668: entering _queue_task() for managed_node1/service
34350 1726853749.84834: worker is 1 (out of 1 available)
34350 1726853749.84847: exiting _queue_task() for managed_node1/service
34350 1726853749.84860: done queuing things up, now waiting for results queue to drain
34350 1726853749.84862: waiting for pending results...
34350 1726853749.84999: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34350 1726853749.85079: in run() - task 02083763-bbaf-b6c1-0de4-000000000023
34350 1726853749.85090: variable 'ansible_search_path' from source: unknown
34350 1726853749.85093: variable 'ansible_search_path' from source: unknown
34350 1726853749.85122: calling self._execute()
34350 1726853749.85179: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.85182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.85192: variable 'omit' from source: magic vars
34350 1726853749.85430: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.85446: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.85517: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.85522: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.85525: when evaluation is False, skipping this task
34350 1726853749.85528: _execute() done
34350 1726853749.85531: dumping result to json
34350 1726853749.85533: done dumping result, returning
34350 1726853749.85539: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-b6c1-0de4-000000000023]
34350 1726853749.85544: sending task result for task 02083763-bbaf-b6c1-0de4-000000000023
34350 1726853749.85626: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000023
34350 1726853749.85629: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34350 1726853749.85685: no more pending results, returning what we have
34350 1726853749.85688: results queue empty
34350 1726853749.85689: checking for any_errors_fatal
34350 1726853749.85693: done checking for any_errors_fatal
34350 1726853749.85693: checking for max_fail_percentage
34350 1726853749.85695: done checking for max_fail_percentage
34350 1726853749.85695: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.85696: done checking to see if all hosts have failed
34350 1726853749.85697: getting the remaining hosts for this loop
34350 1726853749.85698: done getting the remaining hosts for this loop
34350 1726853749.85701: getting the next task for host managed_node1
34350 1726853749.85705: done getting next task for host managed_node1
34350 1726853749.85708: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34350 1726853749.85711: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.85723: getting variables
34350 1726853749.85724: in VariableManager get_vars()
34350 1726853749.85761: Calling all_inventory to load vars for managed_node1
34350 1726853749.85763: Calling groups_inventory to load vars for managed_node1
34350 1726853749.85764: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.85770: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.85774: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.85776: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.85880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.85999: done with get_vars()
34350 1726853749.86006: done getting variables
34350 1726853749.86043: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 13:35:49 -0400 (0:00:00.013) 0:00:03.733 ******
34350 1726853749.86065: entering _queue_task() for managed_node1/service
34350 1726853749.86250: worker is 1 (out of 1 available)
34350 1726853749.86268: exiting _queue_task() for managed_node1/service
34350 1726853749.86280: done queuing things up, now waiting for results queue to drain
34350 1726853749.86281: waiting for pending results...
34350 1726853749.86425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34350 1726853749.86498: in run() - task 02083763-bbaf-b6c1-0de4-000000000024
34350 1726853749.86515: variable 'ansible_search_path' from source: unknown
34350 1726853749.86519: variable 'ansible_search_path' from source: unknown
34350 1726853749.86541: calling self._execute()
34350 1726853749.86601: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.86604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.86616: variable 'omit' from source: magic vars
34350 1726853749.86863: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.86870: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.86945: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.86948: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.86952: when evaluation is False, skipping this task
34350 1726853749.86955: _execute() done
34350 1726853749.86960: dumping result to json
34350 1726853749.86963: done dumping result, returning
34350 1726853749.86966: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-b6c1-0de4-000000000024]
34350 1726853749.86979: sending task result for task 02083763-bbaf-b6c1-0de4-000000000024
34350 1726853749.87054: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000024
34350 1726853749.87057: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853749.87104: no more pending results, returning what we have
34350 1726853749.87107: results queue empty
34350 1726853749.87108: checking for any_errors_fatal
34350 1726853749.87111: done checking for any_errors_fatal
34350 1726853749.87112: checking for max_fail_percentage
34350 1726853749.87114: done checking for max_fail_percentage
34350 1726853749.87115: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.87116: done checking to see if all hosts have failed
34350 1726853749.87116: getting the remaining hosts for this loop
34350 1726853749.87117: done getting the remaining hosts for this loop
34350 1726853749.87120: getting the next task for host managed_node1
34350 1726853749.87125: done getting next task for host managed_node1
34350 1726853749.87128: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34350 1726853749.87131: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.87142: getting variables
34350 1726853749.87144: in VariableManager get_vars()
34350 1726853749.87182: Calling all_inventory to load vars for managed_node1
34350 1726853749.87185: Calling groups_inventory to load vars for managed_node1
34350 1726853749.87186: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.87193: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.87194: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.87196: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.87325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.87444: done with get_vars()
34350 1726853749.87450: done getting variables
34350 1726853749.87491: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.747 ******
34350 1726853749.87513: entering _queue_task() for managed_node1/service
34350 1726853749.87685: worker is 1 (out of 1 available)
34350 1726853749.87699: exiting _queue_task() for managed_node1/service
34350 1726853749.87709: done queuing things up, now waiting for results queue to drain
34350 1726853749.87710: waiting for pending results...
34350 1726853749.87852: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
34350 1726853749.87923: in run() - task 02083763-bbaf-b6c1-0de4-000000000025
34350 1726853749.87934: variable 'ansible_search_path' from source: unknown
34350 1726853749.87939: variable 'ansible_search_path' from source: unknown
34350 1726853749.87967: calling self._execute()
34350 1726853749.88023: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853749.88027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853749.88036: variable 'omit' from source: magic vars
34350 1726853749.88283: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.88291: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853749.88364: variable 'ansible_distribution_major_version' from source: facts
34350 1726853749.88370: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853749.88374: when evaluation is False, skipping this task
34350 1726853749.88377: _execute() done
34350 1726853749.88381: dumping result to json
34350 1726853749.88384: done dumping result, returning
34350 1726853749.88387: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-b6c1-0de4-000000000025]
34350 1726853749.88389: sending task result for task 02083763-bbaf-b6c1-0de4-000000000025
34350 1726853749.88474: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000025
34350 1726853749.88476: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34350 1726853749.88533: no more pending results, returning what we have
34350 1726853749.88536: results queue empty
34350 1726853749.88537: checking for any_errors_fatal
34350 1726853749.88541: done checking for any_errors_fatal
34350 1726853749.88542: checking for max_fail_percentage
34350 1726853749.88543: done checking for max_fail_percentage
34350 1726853749.88544: checking to see if all hosts have failed and the running result is not ok
34350 1726853749.88545: done checking to see if all hosts have failed
34350 1726853749.88545: getting the remaining hosts for this loop
34350 1726853749.88546: done getting the remaining hosts for this loop
34350 1726853749.88549: getting the next task for host managed_node1
34350 1726853749.88554: done getting next task for host managed_node1
34350 1726853749.88557: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34350 1726853749.88562: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853749.88576: getting variables
34350 1726853749.88578: in VariableManager get_vars()
34350 1726853749.88605: Calling all_inventory to load vars for managed_node1
34350 1726853749.88608: Calling groups_inventory to load vars for managed_node1
34350 1726853749.88611: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853749.88616: Calling all_plugins_play to load vars for managed_node1
34350 1726853749.88618: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853749.88619: Calling groups_plugins_play to load vars for managed_node1
34350 1726853749.88723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853749.88842: done with get_vars()
34350 1726853749.88849: done getting variables
34350 1726853749.88890: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:35:49 -0400 (0:00:00.013) 0:00:03.761 ******
34350 1726853749.88913: entering _queue_task() for managed_node1/copy
34350 1726853749.89089: worker is 1 (out of 1 available)
34350 1726853749.89102: exiting _queue_task() for managed_node1/copy
34350 1726853749.89112: done queuing things up, now waiting for results queue to drain
34350 1726853749.89113: waiting for pending results...
34350 1726853749.89255: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34350 1726853749.89322: in run() - task 02083763-bbaf-b6c1-0de4-000000000026 34350 1726853749.89334: variable 'ansible_search_path' from source: unknown 34350 1726853749.89338: variable 'ansible_search_path' from source: unknown 34350 1726853749.89364: calling self._execute() 34350 1726853749.89421: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.89425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.89434: variable 'omit' from source: magic vars 34350 1726853749.89728: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.89781: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.89813: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.89816: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.89819: when evaluation is False, skipping this task 34350 1726853749.89822: _execute() done 34350 1726853749.89825: dumping result to json 34350 1726853749.89829: done dumping result, returning 34350 1726853749.89840: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-b6c1-0de4-000000000026] 34350 1726853749.89843: sending task result for task 02083763-bbaf-b6c1-0de4-000000000026 34350 1726853749.89936: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000026 34350 1726853749.89939: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853749.89985: no more pending results, returning what we have 34350 1726853749.89988: results queue empty 34350 
1726853749.89989: checking for any_errors_fatal 34350 1726853749.89994: done checking for any_errors_fatal 34350 1726853749.89994: checking for max_fail_percentage 34350 1726853749.89996: done checking for max_fail_percentage 34350 1726853749.89997: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.89997: done checking to see if all hosts have failed 34350 1726853749.89998: getting the remaining hosts for this loop 34350 1726853749.89999: done getting the remaining hosts for this loop 34350 1726853749.90002: getting the next task for host managed_node1 34350 1726853749.90006: done getting next task for host managed_node1 34350 1726853749.90009: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853749.90011: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.90023: getting variables 34350 1726853749.90025: in VariableManager get_vars() 34350 1726853749.90060: Calling all_inventory to load vars for managed_node1 34350 1726853749.90062: Calling groups_inventory to load vars for managed_node1 34350 1726853749.90063: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.90070: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.90073: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.90075: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.90203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.90364: done with get_vars() 34350 1726853749.90375: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:49 -0400 (0:00:00.015) 0:00:03.777 ****** 34350 1726853749.90464: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853749.90465: Creating lock for fedora.linux_system_roles.network_connections 34350 1726853749.90691: worker is 1 (out of 1 available) 34350 1726853749.90703: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853749.90715: done queuing things up, now waiting for results queue to drain 34350 1726853749.90716: waiting for pending results... 
34350 1726853749.90862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853749.90939: in run() - task 02083763-bbaf-b6c1-0de4-000000000027 34350 1726853749.90952: variable 'ansible_search_path' from source: unknown 34350 1726853749.90957: variable 'ansible_search_path' from source: unknown 34350 1726853749.90990: calling self._execute() 34350 1726853749.91048: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.91053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.91064: variable 'omit' from source: magic vars 34350 1726853749.91319: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.91328: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.91405: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.91409: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.91413: when evaluation is False, skipping this task 34350 1726853749.91418: _execute() done 34350 1726853749.91420: dumping result to json 34350 1726853749.91423: done dumping result, returning 34350 1726853749.91432: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-b6c1-0de4-000000000027] 34350 1726853749.91436: sending task result for task 02083763-bbaf-b6c1-0de4-000000000027 34350 1726853749.91523: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000027 34350 1726853749.91526: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853749.91572: no more pending results, returning what we have 34350 1726853749.91576: results queue empty 34350 1726853749.91577: checking 
for any_errors_fatal 34350 1726853749.91582: done checking for any_errors_fatal 34350 1726853749.91583: checking for max_fail_percentage 34350 1726853749.91584: done checking for max_fail_percentage 34350 1726853749.91585: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.91586: done checking to see if all hosts have failed 34350 1726853749.91586: getting the remaining hosts for this loop 34350 1726853749.91587: done getting the remaining hosts for this loop 34350 1726853749.91590: getting the next task for host managed_node1 34350 1726853749.91595: done getting next task for host managed_node1 34350 1726853749.91598: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853749.91600: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.91612: getting variables 34350 1726853749.91613: in VariableManager get_vars() 34350 1726853749.91648: Calling all_inventory to load vars for managed_node1 34350 1726853749.91650: Calling groups_inventory to load vars for managed_node1 34350 1726853749.91653: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.91659: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.91661: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.91663: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.91763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.91898: done with get_vars() 34350 1726853749.91905: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.792 ****** 34350 1726853749.91954: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853749.91955: Creating lock for fedora.linux_system_roles.network_state 34350 1726853749.92125: worker is 1 (out of 1 available) 34350 1726853749.92137: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853749.92146: done queuing things up, now waiting for results queue to drain 34350 1726853749.92147: waiting for pending results... 
34350 1726853749.92291: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853749.92476: in run() - task 02083763-bbaf-b6c1-0de4-000000000028 34350 1726853749.92480: variable 'ansible_search_path' from source: unknown 34350 1726853749.92483: variable 'ansible_search_path' from source: unknown 34350 1726853749.92486: calling self._execute() 34350 1726853749.92501: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.92514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.92527: variable 'omit' from source: magic vars 34350 1726853749.92866: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.92886: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.92996: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.93007: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.93013: when evaluation is False, skipping this task 34350 1726853749.93021: _execute() done 34350 1726853749.93027: dumping result to json 34350 1726853749.93033: done dumping result, returning 34350 1726853749.93044: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-b6c1-0de4-000000000028] 34350 1726853749.93052: sending task result for task 02083763-bbaf-b6c1-0de4-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853749.93226: no more pending results, returning what we have 34350 1726853749.93230: results queue empty 34350 1726853749.93230: checking for any_errors_fatal 34350 1726853749.93239: done checking for any_errors_fatal 34350 1726853749.93240: checking for max_fail_percentage 34350 1726853749.93241: done 
checking for max_fail_percentage 34350 1726853749.93242: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.93243: done checking to see if all hosts have failed 34350 1726853749.93244: getting the remaining hosts for this loop 34350 1726853749.93245: done getting the remaining hosts for this loop 34350 1726853749.93248: getting the next task for host managed_node1 34350 1726853749.93255: done getting next task for host managed_node1 34350 1726853749.93259: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853749.93261: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.93276: getting variables 34350 1726853749.93278: in VariableManager get_vars() 34350 1726853749.93315: Calling all_inventory to load vars for managed_node1 34350 1726853749.93317: Calling groups_inventory to load vars for managed_node1 34350 1726853749.93319: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.93327: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.93329: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.93332: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.93509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.93729: done with get_vars() 34350 1726853749.93738: done getting variables 34350 1726853749.93766: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000028 34350 1726853749.93769: WORKER PROCESS EXITING 34350 1726853749.93802: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:49 -0400 (0:00:00.018) 0:00:03.811 ****** 34350 1726853749.93831: entering _queue_task() for managed_node1/debug 34350 1726853749.94047: worker is 1 (out of 1 available) 34350 1726853749.94061: exiting _queue_task() for managed_node1/debug 34350 1726853749.94069: done queuing things up, now waiting for results queue to drain 34350 1726853749.94072: waiting for pending results... 
34350 1726853749.94487: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853749.94491: in run() - task 02083763-bbaf-b6c1-0de4-000000000029 34350 1726853749.94494: variable 'ansible_search_path' from source: unknown 34350 1726853749.94496: variable 'ansible_search_path' from source: unknown 34350 1726853749.94499: calling self._execute() 34350 1726853749.94566: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.94579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.94592: variable 'omit' from source: magic vars 34350 1726853749.94931: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.94951: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.95075: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.95160: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.95164: when evaluation is False, skipping this task 34350 1726853749.95166: _execute() done 34350 1726853749.95169: dumping result to json 34350 1726853749.95173: done dumping result, returning 34350 1726853749.95176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-b6c1-0de4-000000000029] 34350 1726853749.95178: sending task result for task 02083763-bbaf-b6c1-0de4-000000000029 34350 1726853749.95238: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000029 34350 1726853749.95240: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853749.95307: no more pending results, returning what we have 34350 1726853749.95312: results queue empty 34350 1726853749.95313: checking for any_errors_fatal 34350 1726853749.95319: done 
checking for any_errors_fatal 34350 1726853749.95320: checking for max_fail_percentage 34350 1726853749.95322: done checking for max_fail_percentage 34350 1726853749.95323: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.95323: done checking to see if all hosts have failed 34350 1726853749.95324: getting the remaining hosts for this loop 34350 1726853749.95326: done getting the remaining hosts for this loop 34350 1726853749.95329: getting the next task for host managed_node1 34350 1726853749.95337: done getting next task for host managed_node1 34350 1726853749.95341: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853749.95345: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.95362: getting variables 34350 1726853749.95364: in VariableManager get_vars() 34350 1726853749.95416: Calling all_inventory to load vars for managed_node1 34350 1726853749.95420: Calling groups_inventory to load vars for managed_node1 34350 1726853749.95422: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.95434: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.95437: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.95440: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.95746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.96006: done with get_vars() 34350 1726853749.96017: done getting variables 34350 1726853749.96055: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:49 -0400 (0:00:00.022) 0:00:03.833 ****** 34350 1726853749.96084: entering _queue_task() for managed_node1/debug 34350 1726853749.96265: worker is 1 (out of 1 available) 34350 1726853749.96279: exiting _queue_task() for managed_node1/debug 34350 1726853749.96289: done queuing things up, now waiting for results queue to drain 34350 1726853749.96291: waiting for pending results... 
34350 1726853749.96460: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853749.96538: in run() - task 02083763-bbaf-b6c1-0de4-00000000002a 34350 1726853749.96549: variable 'ansible_search_path' from source: unknown 34350 1726853749.96553: variable 'ansible_search_path' from source: unknown 34350 1726853749.96584: calling self._execute() 34350 1726853749.96642: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.96649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.96660: variable 'omit' from source: magic vars 34350 1726853749.96912: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.96920: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.96999: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.97003: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.97006: when evaluation is False, skipping this task 34350 1726853749.97008: _execute() done 34350 1726853749.97011: dumping result to json 34350 1726853749.97016: done dumping result, returning 34350 1726853749.97023: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-b6c1-0de4-00000000002a] 34350 1726853749.97027: sending task result for task 02083763-bbaf-b6c1-0de4-00000000002a 34350 1726853749.97109: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000002a 34350 1726853749.97112: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853749.97155: no more pending results, returning what we have 34350 1726853749.97160: results queue empty 34350 1726853749.97161: checking for any_errors_fatal 34350 1726853749.97167: done 
checking for any_errors_fatal 34350 1726853749.97167: checking for max_fail_percentage 34350 1726853749.97169: done checking for max_fail_percentage 34350 1726853749.97170: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.97172: done checking to see if all hosts have failed 34350 1726853749.97173: getting the remaining hosts for this loop 34350 1726853749.97174: done getting the remaining hosts for this loop 34350 1726853749.97177: getting the next task for host managed_node1 34350 1726853749.97182: done getting next task for host managed_node1 34350 1726853749.97186: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853749.97188: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.97199: getting variables 34350 1726853749.97201: in VariableManager get_vars() 34350 1726853749.97238: Calling all_inventory to load vars for managed_node1 34350 1726853749.97240: Calling groups_inventory to load vars for managed_node1 34350 1726853749.97241: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.97247: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.97248: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.97250: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.97352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.97475: done with get_vars() 34350 1726853749.97482: done getting variables 34350 1726853749.97518: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:49 -0400 (0:00:00.014) 0:00:03.848 ****** 34350 1726853749.97538: entering _queue_task() for managed_node1/debug 34350 1726853749.97710: worker is 1 (out of 1 available) 34350 1726853749.97723: exiting _queue_task() for managed_node1/debug 34350 1726853749.97734: done queuing things up, now waiting for results queue to drain 34350 1726853749.97735: waiting for pending results... 
34350 1726853749.97877: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853749.97938: in run() - task 02083763-bbaf-b6c1-0de4-00000000002b 34350 1726853749.97949: variable 'ansible_search_path' from source: unknown 34350 1726853749.97952: variable 'ansible_search_path' from source: unknown 34350 1726853749.97987: calling self._execute() 34350 1726853749.98276: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853749.98279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853749.98282: variable 'omit' from source: magic vars 34350 1726853749.98392: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.98407: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853749.98518: variable 'ansible_distribution_major_version' from source: facts 34350 1726853749.98528: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853749.98535: when evaluation is False, skipping this task 34350 1726853749.98541: _execute() done 34350 1726853749.98548: dumping result to json 34350 1726853749.98555: done dumping result, returning 34350 1726853749.98566: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-b6c1-0de4-00000000002b] 34350 1726853749.98576: sending task result for task 02083763-bbaf-b6c1-0de4-00000000002b 34350 1726853749.98668: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000002b 34350 1726853749.98678: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853749.98738: no more pending results, returning what we have 34350 1726853749.98742: results queue empty 34350 1726853749.98742: checking for any_errors_fatal 34350 1726853749.98747: done checking for 
any_errors_fatal 34350 1726853749.98748: checking for max_fail_percentage 34350 1726853749.98749: done checking for max_fail_percentage 34350 1726853749.98750: checking to see if all hosts have failed and the running result is not ok 34350 1726853749.98751: done checking to see if all hosts have failed 34350 1726853749.98752: getting the remaining hosts for this loop 34350 1726853749.98753: done getting the remaining hosts for this loop 34350 1726853749.98757: getting the next task for host managed_node1 34350 1726853749.98767: done getting next task for host managed_node1 34350 1726853749.98772: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34350 1726853749.98775: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853749.98787: getting variables 34350 1726853749.98789: in VariableManager get_vars() 34350 1726853749.98825: Calling all_inventory to load vars for managed_node1 34350 1726853749.98828: Calling groups_inventory to load vars for managed_node1 34350 1726853749.98830: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853749.98837: Calling all_plugins_play to load vars for managed_node1 34350 1726853749.98840: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853749.98842: Calling groups_plugins_play to load vars for managed_node1 34350 1726853749.99063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853749.99296: done with get_vars() 34350 1726853749.99304: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:49 -0400 (0:00:00.018) 0:00:03.866 ****** 34350 1726853749.99403: entering _queue_task() for managed_node1/ping 34350 1726853749.99405: Creating lock for ping 34350 1726853749.99764: worker is 1 (out of 1 available) 34350 1726853749.99775: exiting _queue_task() for managed_node1/ping 34350 1726853749.99784: done queuing things up, now waiting for results queue to drain 34350 1726853749.99785: waiting for pending results... 
34350 1726853749.99956: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
34350 1726853750.00065: in run() - task 02083763-bbaf-b6c1-0de4-00000000002c
34350 1726853750.00096: variable 'ansible_search_path' from source: unknown
34350 1726853750.00104: variable 'ansible_search_path' from source: unknown
34350 1726853750.00139: calling self._execute()
34350 1726853750.00230: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.00244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.00258: variable 'omit' from source: magic vars
34350 1726853750.00651: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.00670: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.00799: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.00810: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.00817: when evaluation is False, skipping this task
34350 1726853750.00825: _execute() done
34350 1726853750.00841: dumping result to json
34350 1726853750.00895: done dumping result, returning
34350 1726853750.00899: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-b6c1-0de4-00000000002c]
34350 1726853750.00901: sending task result for task 02083763-bbaf-b6c1-0de4-00000000002c
34350 1726853750.01078: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000002c
34350 1726853750.01081: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.01124: no more pending results, returning what we have
34350 1726853750.01127: results queue empty
34350 1726853750.01128: checking for any_errors_fatal
34350 1726853750.01133: done checking for any_errors_fatal
34350 1726853750.01134: checking for max_fail_percentage
34350 1726853750.01135: done checking for max_fail_percentage
34350 1726853750.01136: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.01137: done checking to see if all hosts have failed
34350 1726853750.01138: getting the remaining hosts for this loop
34350 1726853750.01139: done getting the remaining hosts for this loop
34350 1726853750.01143: getting the next task for host managed_node1
34350 1726853750.01151: done getting next task for host managed_node1
34350 1726853750.01153: ^ task is: TASK: meta (role_complete)
34350 1726853750.01155: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.01169: getting variables
34350 1726853750.01175: in VariableManager get_vars()
34350 1726853750.01227: Calling all_inventory to load vars for managed_node1
34350 1726853750.01230: Calling groups_inventory to load vars for managed_node1
34350 1726853750.01232: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.01241: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.01244: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.01247: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.01510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.01736: done with get_vars()
34350 1726853750.01746: done getting variables
34350 1726853750.01829: done queuing things up, now waiting for results queue to drain
34350 1726853750.01831: results queue empty
34350 1726853750.01832: checking for any_errors_fatal
34350 1726853750.01834: done checking for any_errors_fatal
34350 1726853750.01835: checking for max_fail_percentage
34350 1726853750.01836: done checking for max_fail_percentage
34350 1726853750.01837: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.01838: done checking to see if all hosts have failed
34350 1726853750.01838: getting the remaining hosts for this loop
34350 1726853750.01839: done getting the remaining hosts for this loop
34350 1726853750.01841: getting the next task for host managed_node1
34350 1726853750.01845: done getting next task for host managed_node1
34350 1726853750.01848: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853750.01850: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.01858: getting variables
34350 1726853750.01860: in VariableManager get_vars()
34350 1726853750.01879: Calling all_inventory to load vars for managed_node1
34350 1726853750.01881: Calling groups_inventory to load vars for managed_node1
34350 1726853750.01883: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.01887: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.01894: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.01904: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.02056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.02505: done with get_vars()
34350 1726853750.02512: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:35:50 -0400 (0:00:00.031) 0:00:03.898 ******
34350 1726853750.02588: entering _queue_task() for managed_node1/include_tasks
34350 1726853750.02837: worker is 1 (out of 1 available)
34350 1726853750.02846: exiting _queue_task() for managed_node1/include_tasks
34350 1726853750.02856: done queuing things up, now waiting for results queue to drain
34350 1726853750.02857: waiting for pending results...
34350 1726853750.03203: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853750.03273: in run() - task 02083763-bbaf-b6c1-0de4-000000000063
34350 1726853750.03304: variable 'ansible_search_path' from source: unknown
34350 1726853750.03312: variable 'ansible_search_path' from source: unknown
34350 1726853750.03350: calling self._execute()
34350 1726853750.03445: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.03477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.03480: variable 'omit' from source: magic vars
34350 1726853750.03862: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.03959: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.04013: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.04026: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.04036: when evaluation is False, skipping this task
34350 1726853750.04044: _execute() done
34350 1726853750.04050: dumping result to json
34350 1726853750.04077: done dumping result, returning
34350 1726853750.04179: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-b6c1-0de4-000000000063]
34350 1726853750.04183: sending task result for task 02083763-bbaf-b6c1-0de4-000000000063
34350 1726853750.04258: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000063
34350 1726853750.04262: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.04323: no more pending results, returning what we have
34350 1726853750.04328: results queue empty
34350 1726853750.04328: checking for any_errors_fatal
34350 1726853750.04330: done checking for any_errors_fatal
34350 1726853750.04330: checking for max_fail_percentage
34350 1726853750.04331: done checking for max_fail_percentage
34350 1726853750.04332: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.04333: done checking to see if all hosts have failed
34350 1726853750.04334: getting the remaining hosts for this loop
34350 1726853750.04336: done getting the remaining hosts for this loop
34350 1726853750.04339: getting the next task for host managed_node1
34350 1726853750.04346: done getting next task for host managed_node1
34350 1726853750.04349: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34350 1726853750.04352: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.04368: getting variables
34350 1726853750.04369: in VariableManager get_vars()
34350 1726853750.04508: Calling all_inventory to load vars for managed_node1
34350 1726853750.04511: Calling groups_inventory to load vars for managed_node1
34350 1726853750.04512: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.04520: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.04522: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.04524: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.04773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.05026: done with get_vars()
34350 1726853750.05037: done getting variables
34350 1726853750.05095: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024 13:35:50 -0400 (0:00:00.025) 0:00:03.924 ******
34350 1726853750.05125: entering _queue_task() for managed_node1/debug
34350 1726853750.05369: worker is 1 (out of 1 available)
34350 1726853750.05385: exiting _queue_task() for managed_node1/debug
34350 1726853750.05397: done queuing things up, now waiting for results queue to drain
34350 1726853750.05398: waiting for pending results...
34350 1726853750.05697: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
34350 1726853750.05764: in run() - task 02083763-bbaf-b6c1-0de4-000000000064
34350 1726853750.05786: variable 'ansible_search_path' from source: unknown
34350 1726853750.05798: variable 'ansible_search_path' from source: unknown
34350 1726853750.05836: calling self._execute()
34350 1726853750.05927: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.05938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.05951: variable 'omit' from source: magic vars
34350 1726853750.06336: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.06340: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.06446: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.06457: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.06472: when evaluation is False, skipping this task
34350 1726853750.06553: _execute() done
34350 1726853750.06556: dumping result to json
34350 1726853750.06561: done dumping result, returning
34350 1726853750.06563: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-b6c1-0de4-000000000064]
34350 1726853750.06565: sending task result for task 02083763-bbaf-b6c1-0de4-000000000064
34350 1726853750.06630: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000064
34350 1726853750.06633: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853750.06682: no more pending results, returning what we have
34350 1726853750.06686: results queue empty
34350 1726853750.06687: checking for any_errors_fatal
34350 1726853750.06693: done checking for any_errors_fatal
34350 1726853750.06694: checking for max_fail_percentage
34350 1726853750.06695: done checking for max_fail_percentage
34350 1726853750.06697: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.06697: done checking to see if all hosts have failed
34350 1726853750.06698: getting the remaining hosts for this loop
34350 1726853750.06700: done getting the remaining hosts for this loop
34350 1726853750.06703: getting the next task for host managed_node1
34350 1726853750.06711: done getting next task for host managed_node1
34350 1726853750.06714: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34350 1726853750.06717: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.06735: getting variables
34350 1726853750.06736: in VariableManager get_vars()
34350 1726853750.06786: Calling all_inventory to load vars for managed_node1
34350 1726853750.06789: Calling groups_inventory to load vars for managed_node1
34350 1726853750.06792: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.06803: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.06805: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.06808: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.07206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.07413: done with get_vars()
34350 1726853750.07422: done getting variables
34350 1726853750.07479: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024 13:35:50 -0400 (0:00:00.023) 0:00:03.947 ******
34350 1726853750.07507: entering _queue_task() for managed_node1/fail
34350 1726853750.07722: worker is 1 (out of 1 available)
34350 1726853750.07735: exiting _queue_task() for managed_node1/fail
34350 1726853750.07745: done queuing things up, now waiting for results queue to drain
34350 1726853750.07746: waiting for pending results...
34350 1726853750.08097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
34350 1726853750.08124: in run() - task 02083763-bbaf-b6c1-0de4-000000000065
34350 1726853750.08141: variable 'ansible_search_path' from source: unknown
34350 1726853750.08176: variable 'ansible_search_path' from source: unknown
34350 1726853750.08193: calling self._execute()
34350 1726853750.08276: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.08287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.08304: variable 'omit' from source: magic vars
34350 1726853750.08662: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.08776: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.08800: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.08809: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.08816: when evaluation is False, skipping this task
34350 1726853750.08822: _execute() done
34350 1726853750.08827: dumping result to json
34350 1726853750.08832: done dumping result, returning
34350 1726853750.08841: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-b6c1-0de4-000000000065]
34350 1726853750.08848: sending task result for task 02083763-bbaf-b6c1-0de4-000000000065
34350 1726853750.09077: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000065
34350 1726853750.09082: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.09119: no more pending results, returning what we have
34350 1726853750.09122: results queue empty
34350 1726853750.09123: checking for any_errors_fatal
34350 1726853750.09128: done checking for any_errors_fatal
34350 1726853750.09129: checking for max_fail_percentage
34350 1726853750.09131: done checking for max_fail_percentage
34350 1726853750.09131: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.09132: done checking to see if all hosts have failed
34350 1726853750.09133: getting the remaining hosts for this loop
34350 1726853750.09134: done getting the remaining hosts for this loop
34350 1726853750.09137: getting the next task for host managed_node1
34350 1726853750.09143: done getting next task for host managed_node1
34350 1726853750.09147: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34350 1726853750.09150: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.09168: getting variables
34350 1726853750.09170: in VariableManager get_vars()
34350 1726853750.09214: Calling all_inventory to load vars for managed_node1
34350 1726853750.09216: Calling groups_inventory to load vars for managed_node1
34350 1726853750.09219: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.09228: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.09230: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.09233: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.09473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.09679: done with get_vars()
34350 1726853750.09689: done getting variables
34350 1726853750.09746: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 13:35:50 -0400 (0:00:00.022) 0:00:03.970 ******
34350 1726853750.09781: entering _queue_task() for managed_node1/fail
34350 1726853750.10016: worker is 1 (out of 1 available)
34350 1726853750.10027: exiting _queue_task() for managed_node1/fail
34350 1726853750.10038: done queuing things up, now waiting for results queue to drain
34350 1726853750.10039: waiting for pending results...
34350 1726853750.10313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
34350 1726853750.10437: in run() - task 02083763-bbaf-b6c1-0de4-000000000066
34350 1726853750.10456: variable 'ansible_search_path' from source: unknown
34350 1726853750.10469: variable 'ansible_search_path' from source: unknown
34350 1726853750.10514: calling self._execute()
34350 1726853750.10609: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.10621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.10637: variable 'omit' from source: magic vars
34350 1726853750.11040: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.11044: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.11161: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.11256: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.11262: when evaluation is False, skipping this task
34350 1726853750.11265: _execute() done
34350 1726853750.11268: dumping result to json
34350 1726853750.11270: done dumping result, returning
34350 1726853750.11274: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-b6c1-0de4-000000000066]
34350 1726853750.11277: sending task result for task 02083763-bbaf-b6c1-0de4-000000000066
34350 1726853750.11343: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000066
34350 1726853750.11346: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.11395: no more pending results, returning what we have
34350 1726853750.11399: results queue empty
34350 1726853750.11400: checking for any_errors_fatal
34350 1726853750.11407: done checking for any_errors_fatal
34350 1726853750.11408: checking for max_fail_percentage
34350 1726853750.11409: done checking for max_fail_percentage
34350 1726853750.11411: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.11412: done checking to see if all hosts have failed
34350 1726853750.11413: getting the remaining hosts for this loop
34350 1726853750.11414: done getting the remaining hosts for this loop
34350 1726853750.11418: getting the next task for host managed_node1
34350 1726853750.11426: done getting next task for host managed_node1
34350 1726853750.11430: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34350 1726853750.11434: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.11452: getting variables
34350 1726853750.11454: in VariableManager get_vars()
34350 1726853750.11505: Calling all_inventory to load vars for managed_node1
34350 1726853750.11507: Calling groups_inventory to load vars for managed_node1
34350 1726853750.11509: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.11520: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.11523: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.11526: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.11931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.12124: done with get_vars()
34350 1726853750.12132: done getting variables
34350 1726853750.12188: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 13:35:50 -0400 (0:00:00.024) 0:00:03.994 ******
34350 1726853750.12217: entering _queue_task() for managed_node1/fail
34350 1726853750.12439: worker is 1 (out of 1 available)
34350 1726853750.12450: exiting _queue_task() for managed_node1/fail
34350 1726853750.12463: done queuing things up, now waiting for results queue to drain
34350 1726853750.12464: waiting for pending results...
34350 1726853750.12797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
34350 1726853750.12840: in run() - task 02083763-bbaf-b6c1-0de4-000000000067
34350 1726853750.12977: variable 'ansible_search_path' from source: unknown
34350 1726853750.12981: variable 'ansible_search_path' from source: unknown
34350 1726853750.12984: calling self._execute()
34350 1726853750.12992: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.13005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.13021: variable 'omit' from source: magic vars
34350 1726853750.13387: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.13405: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.13523: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.13539: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.13547: when evaluation is False, skipping this task
34350 1726853750.13553: _execute() done
34350 1726853750.13562: dumping result to json
34350 1726853750.13569: done dumping result, returning
34350 1726853750.13582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-b6c1-0de4-000000000067]
34350 1726853750.13591: sending task result for task 02083763-bbaf-b6c1-0de4-000000000067
34350 1726853750.13709: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000067
34350 1726853750.13712: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.13793: no more pending results, returning what we have
34350 1726853750.13797: results queue empty
34350 1726853750.13798: checking for any_errors_fatal
34350 1726853750.13803: done checking for any_errors_fatal
34350 1726853750.13804: checking for max_fail_percentage
34350 1726853750.13806: done checking for max_fail_percentage
34350 1726853750.13807: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.13808: done checking to see if all hosts have failed
34350 1726853750.13809: getting the remaining hosts for this loop
34350 1726853750.13810: done getting the remaining hosts for this loop
34350 1726853750.13814: getting the next task for host managed_node1
34350 1726853750.13822: done getting next task for host managed_node1
34350 1726853750.13825: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34350 1726853750.13828: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.13845: getting variables
34350 1726853750.13847: in VariableManager get_vars()
34350 1726853750.13895: Calling all_inventory to load vars for managed_node1
34350 1726853750.13898: Calling groups_inventory to load vars for managed_node1
34350 1726853750.13901: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.13912: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.13915: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.13917: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.14240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.14455: done with get_vars()
34350 1726853750.14467: done getting variables
34350 1726853750.14520: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 13:35:50 -0400 (0:00:00.023) 0:00:04.018 ******
34350 1726853750.14548: entering _queue_task() for managed_node1/dnf
34350 1726853750.14774: worker is 1 (out of 1 available)
34350 1726853750.14786: exiting _queue_task() for managed_node1/dnf
34350 1726853750.14796: done queuing things up, now waiting for results queue to drain
34350 1726853750.14797: waiting for pending results...
34350 1726853750.15191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
34350 1726853750.15197: in run() - task 02083763-bbaf-b6c1-0de4-000000000068
34350 1726853750.15200: variable 'ansible_search_path' from source: unknown
34350 1726853750.15202: variable 'ansible_search_path' from source: unknown
34350 1726853750.15237: calling self._execute()
34350 1726853750.15330: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.15341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.15355: variable 'omit' from source: magic vars
34350 1726853750.15697: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.15715: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.15845: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.15860: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.15869: when evaluation is False, skipping this task
34350 1726853750.15880: _execute() done
34350 1726853750.15887: dumping result to json
34350 1726853750.15895: done dumping result, returning
34350 1726853750.15906: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000068]
34350 1726853750.15916: sending task result for task 02083763-bbaf-b6c1-0de4-000000000068
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.16105: no more pending results, returning what we have
34350 1726853750.16110: results queue empty
34350 1726853750.16111: checking for any_errors_fatal
34350 1726853750.16118: done checking for any_errors_fatal
34350 1726853750.16119: checking for max_fail_percentage
34350 1726853750.16121: done checking for max_fail_percentage
34350 1726853750.16122: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.16123: done checking to see if all hosts have failed
34350 1726853750.16123: getting the remaining hosts for this loop
34350 1726853750.16125: done getting the remaining hosts for this loop
34350 1726853750.16129: getting the next task for host managed_node1
34350 1726853750.16138: done getting next task for host managed_node1
34350 1726853750.16142: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
34350 1726853750.16145: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.16165: getting variables
34350 1726853750.16167: in VariableManager get_vars()
34350 1726853750.16219: Calling all_inventory to load vars for managed_node1
34350 1726853750.16222: Calling groups_inventory to load vars for managed_node1
34350 1726853750.16224: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.16237: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.16240: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.16243: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.16691: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000068
34350 1726853750.16695: WORKER PROCESS EXITING
34350 1726853750.16718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.16922: done with get_vars()
34350 1726853750.16933: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
34350 1726853750.17011: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 13:35:50 -0400 (0:00:00.024) 0:00:04.043 ******
34350 1726853750.17040: entering _queue_task() for managed_node1/yum
34350 1726853750.17291: worker is 1 (out of 1 available)
34350 1726853750.17302: exiting _queue_task() for managed_node1/yum
34350 1726853750.17312: done queuing things up, now
waiting for results queue to drain 34350 1726853750.17313: waiting for pending results... 34350 1726853750.17582: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34350 1726853750.17707: in run() - task 02083763-bbaf-b6c1-0de4-000000000069 34350 1726853750.17724: variable 'ansible_search_path' from source: unknown 34350 1726853750.17732: variable 'ansible_search_path' from source: unknown 34350 1726853750.17770: calling self._execute() 34350 1726853750.17866: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.17880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.17892: variable 'omit' from source: magic vars 34350 1726853750.18247: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.18265: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.18382: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.18393: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.18400: when evaluation is False, skipping this task 34350 1726853750.18407: _execute() done 34350 1726853750.18413: dumping result to json 34350 1726853750.18421: done dumping result, returning 34350 1726853750.18431: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000069] 34350 1726853750.18440: sending task result for task 02083763-bbaf-b6c1-0de4-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.18612: no more pending 
results, returning what we have 34350 1726853750.18616: results queue empty 34350 1726853750.18617: checking for any_errors_fatal 34350 1726853750.18623: done checking for any_errors_fatal 34350 1726853750.18624: checking for max_fail_percentage 34350 1726853750.18625: done checking for max_fail_percentage 34350 1726853750.18626: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.18627: done checking to see if all hosts have failed 34350 1726853750.18628: getting the remaining hosts for this loop 34350 1726853750.18630: done getting the remaining hosts for this loop 34350 1726853750.18633: getting the next task for host managed_node1 34350 1726853750.18642: done getting next task for host managed_node1 34350 1726853750.18646: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853750.18648: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.18667: getting variables 34350 1726853750.18669: in VariableManager get_vars() 34350 1726853750.18717: Calling all_inventory to load vars for managed_node1 34350 1726853750.18721: Calling groups_inventory to load vars for managed_node1 34350 1726853750.18723: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.18735: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.18737: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.18740: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.19111: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000069 34350 1726853750.19114: WORKER PROCESS EXITING 34350 1726853750.19135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.19333: done with get_vars() 34350 1726853750.19343: done getting variables 34350 1726853750.19605: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:50 -0400 (0:00:00.025) 0:00:04.069 ****** 34350 1726853750.19635: entering _queue_task() for managed_node1/fail 34350 1726853750.20078: worker is 1 (out of 1 available) 34350 1726853750.20090: exiting _queue_task() for managed_node1/fail 34350 1726853750.20101: done queuing things up, now waiting for results queue to drain 34350 1726853750.20102: waiting for pending results... 
34350 1726853750.20498: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853750.20634: in run() - task 02083763-bbaf-b6c1-0de4-00000000006a 34350 1726853750.20654: variable 'ansible_search_path' from source: unknown 34350 1726853750.20664: variable 'ansible_search_path' from source: unknown 34350 1726853750.20702: calling self._execute() 34350 1726853750.20789: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.20799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.20811: variable 'omit' from source: magic vars 34350 1726853750.21168: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.21187: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.21303: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.21313: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.21319: when evaluation is False, skipping this task 34350 1726853750.21326: _execute() done 34350 1726853750.21332: dumping result to json 34350 1726853750.21339: done dumping result, returning 34350 1726853750.21349: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-00000000006a] 34350 1726853750.21361: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006a 34350 1726853750.21469: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.21520: no more pending results, returning what we have 34350 1726853750.21523: results queue empty 
34350 1726853750.21524: checking for any_errors_fatal 34350 1726853750.21530: done checking for any_errors_fatal 34350 1726853750.21531: checking for max_fail_percentage 34350 1726853750.21533: done checking for max_fail_percentage 34350 1726853750.21534: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.21534: done checking to see if all hosts have failed 34350 1726853750.21535: getting the remaining hosts for this loop 34350 1726853750.21537: done getting the remaining hosts for this loop 34350 1726853750.21540: getting the next task for host managed_node1 34350 1726853750.21548: done getting next task for host managed_node1 34350 1726853750.21551: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34350 1726853750.21554: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.21575: getting variables 34350 1726853750.21576: in VariableManager get_vars() 34350 1726853750.21623: Calling all_inventory to load vars for managed_node1 34350 1726853750.21626: Calling groups_inventory to load vars for managed_node1 34350 1726853750.21628: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.21639: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.21642: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.21645: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.22053: WORKER PROCESS EXITING 34350 1726853750.22077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.22266: done with get_vars() 34350 1726853750.22277: done getting variables 34350 1726853750.22334: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:50 -0400 (0:00:00.027) 0:00:04.096 ****** 34350 1726853750.22368: entering _queue_task() for managed_node1/package 34350 1726853750.22619: worker is 1 (out of 1 available) 34350 1726853750.22631: exiting _queue_task() for managed_node1/package 34350 1726853750.22644: done queuing things up, now waiting for results queue to drain 34350 1726853750.22645: waiting for pending results... 
34350 1726853750.23242: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34350 1726853750.23609: in run() - task 02083763-bbaf-b6c1-0de4-00000000006b 34350 1726853750.23628: variable 'ansible_search_path' from source: unknown 34350 1726853750.23637: variable 'ansible_search_path' from source: unknown 34350 1726853750.23683: calling self._execute() 34350 1726853750.23855: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.23870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.23926: variable 'omit' from source: magic vars 34350 1726853750.24341: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.24364: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.24485: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.24496: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.24503: when evaluation is False, skipping this task 34350 1726853750.24508: _execute() done 34350 1726853750.24513: dumping result to json 34350 1726853750.24519: done dumping result, returning 34350 1726853750.24527: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-b6c1-0de4-00000000006b] 34350 1726853750.24534: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006b 34350 1726853750.24641: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006b 34350 1726853750.24649: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.24721: no more pending results, returning what we have 34350 1726853750.24726: results queue empty 34350 1726853750.24727: checking for any_errors_fatal 34350 1726853750.24735: done 
checking for any_errors_fatal 34350 1726853750.24735: checking for max_fail_percentage 34350 1726853750.24737: done checking for max_fail_percentage 34350 1726853750.24738: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.24739: done checking to see if all hosts have failed 34350 1726853750.24740: getting the remaining hosts for this loop 34350 1726853750.24741: done getting the remaining hosts for this loop 34350 1726853750.24745: getting the next task for host managed_node1 34350 1726853750.24755: done getting next task for host managed_node1 34350 1726853750.24762: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853750.24765: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.24787: getting variables 34350 1726853750.24789: in VariableManager get_vars() 34350 1726853750.24840: Calling all_inventory to load vars for managed_node1 34350 1726853750.24843: Calling groups_inventory to load vars for managed_node1 34350 1726853750.24846: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.24862: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.24865: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.24868: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.25136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.25439: done with get_vars() 34350 1726853750.25448: done getting variables 34350 1726853750.25501: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:50 -0400 (0:00:00.031) 0:00:04.128 ****** 34350 1726853750.25531: entering _queue_task() for managed_node1/package 34350 1726853750.25752: worker is 1 (out of 1 available) 34350 1726853750.25766: exiting _queue_task() for managed_node1/package 34350 1726853750.25777: done queuing things up, now waiting for results queue to drain 34350 1726853750.25779: waiting for pending results... 
34350 1726853750.26034: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853750.26154: in run() - task 02083763-bbaf-b6c1-0de4-00000000006c 34350 1726853750.26181: variable 'ansible_search_path' from source: unknown 34350 1726853750.26195: variable 'ansible_search_path' from source: unknown 34350 1726853750.26376: calling self._execute() 34350 1726853750.26380: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.26383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.26386: variable 'omit' from source: magic vars 34350 1726853750.26703: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.26724: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.26842: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.26853: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.26863: when evaluation is False, skipping this task 34350 1726853750.26870: _execute() done 34350 1726853750.26879: dumping result to json 34350 1726853750.26886: done dumping result, returning 34350 1726853750.26897: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-b6c1-0de4-00000000006c] 34350 1726853750.26906: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006c skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.27083: no more pending results, returning what we have 34350 1726853750.27087: results queue empty 34350 1726853750.27088: checking for any_errors_fatal 34350 1726853750.27095: done checking for any_errors_fatal 34350 
1726853750.27096: checking for max_fail_percentage 34350 1726853750.27098: done checking for max_fail_percentage 34350 1726853750.27099: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.27100: done checking to see if all hosts have failed 34350 1726853750.27101: getting the remaining hosts for this loop 34350 1726853750.27102: done getting the remaining hosts for this loop 34350 1726853750.27105: getting the next task for host managed_node1 34350 1726853750.27113: done getting next task for host managed_node1 34350 1726853750.27117: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853750.27120: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.27137: getting variables 34350 1726853750.27139: in VariableManager get_vars() 34350 1726853750.27190: Calling all_inventory to load vars for managed_node1 34350 1726853750.27193: Calling groups_inventory to load vars for managed_node1 34350 1726853750.27195: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.27206: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.27209: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.27212: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.27606: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006c 34350 1726853750.27609: WORKER PROCESS EXITING 34350 1726853750.27629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.27840: done with get_vars() 34350 1726853750.27849: done getting variables 34350 1726853750.27906: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:50 -0400 (0:00:00.024) 0:00:04.152 ****** 34350 1726853750.27935: entering _queue_task() for managed_node1/package 34350 1726853750.28167: worker is 1 (out of 1 available) 34350 1726853750.28180: exiting _queue_task() for managed_node1/package 34350 1726853750.28191: done queuing things up, now waiting for results queue to drain 34350 1726853750.28192: waiting for pending results... 
34350 1726853750.28439: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853750.28567: in run() - task 02083763-bbaf-b6c1-0de4-00000000006d 34350 1726853750.28588: variable 'ansible_search_path' from source: unknown 34350 1726853750.28595: variable 'ansible_search_path' from source: unknown 34350 1726853750.28631: calling self._execute() 34350 1726853750.28714: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.28725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.28737: variable 'omit' from source: magic vars 34350 1726853750.29089: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.29106: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.29223: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.29233: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.29240: when evaluation is False, skipping this task 34350 1726853750.29247: _execute() done 34350 1726853750.29253: dumping result to json 34350 1726853750.29264: done dumping result, returning 34350 1726853750.29277: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-b6c1-0de4-00000000006d] 34350 1726853750.29286: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.29462: no more pending results, returning what we have 34350 1726853750.29466: results queue empty 34350 1726853750.29467: checking for any_errors_fatal 34350 1726853750.29475: done checking for any_errors_fatal 34350 1726853750.29476: 
checking for max_fail_percentage 34350 1726853750.29477: done checking for max_fail_percentage 34350 1726853750.29478: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.29479: done checking to see if all hosts have failed 34350 1726853750.29480: getting the remaining hosts for this loop 34350 1726853750.29482: done getting the remaining hosts for this loop 34350 1726853750.29485: getting the next task for host managed_node1 34350 1726853750.29493: done getting next task for host managed_node1 34350 1726853750.29496: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853750.29499: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.29516: getting variables 34350 1726853750.29518: in VariableManager get_vars() 34350 1726853750.29566: Calling all_inventory to load vars for managed_node1 34350 1726853750.29569: Calling groups_inventory to load vars for managed_node1 34350 1726853750.29746: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.29753: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006d 34350 1726853750.29756: WORKER PROCESS EXITING 34350 1726853750.29766: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.29769: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.29774: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.29939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.30148: done with get_vars() 34350 1726853750.30157: done getting variables 34350 1726853750.30214: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:50 -0400 (0:00:00.023) 0:00:04.175 ****** 34350 1726853750.30246: entering _queue_task() for managed_node1/service 34350 1726853750.30481: worker is 1 (out of 1 available) 34350 1726853750.30494: exiting _queue_task() for managed_node1/service 34350 1726853750.30505: done queuing things up, now waiting for results queue to drain 34350 1726853750.30507: waiting for pending results... 
34350 1726853750.30761: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853750.30892: in run() - task 02083763-bbaf-b6c1-0de4-00000000006e 34350 1726853750.30909: variable 'ansible_search_path' from source: unknown 34350 1726853750.30917: variable 'ansible_search_path' from source: unknown 34350 1726853750.30955: calling self._execute() 34350 1726853750.31048: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.31062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.31078: variable 'omit' from source: magic vars 34350 1726853750.31425: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.31441: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.31878: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.31882: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.31884: when evaluation is False, skipping this task 34350 1726853750.31887: _execute() done 34350 1726853750.31889: dumping result to json 34350 1726853750.31891: done dumping result, returning 34350 1726853750.31894: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-00000000006e] 34350 1726853750.31896: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006e 34350 1726853750.31962: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006e 34350 1726853750.31965: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.32211: no more pending results, returning what we have 34350 1726853750.32214: results queue empty 
34350 1726853750.32215: checking for any_errors_fatal
34350 1726853750.32220: done checking for any_errors_fatal
34350 1726853750.32221: checking for max_fail_percentage
34350 1726853750.32223: done checking for max_fail_percentage
34350 1726853750.32224: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.32225: done checking to see if all hosts have failed
34350 1726853750.32225: getting the remaining hosts for this loop
34350 1726853750.32227: done getting the remaining hosts for this loop
34350 1726853750.32230: getting the next task for host managed_node1
34350 1726853750.32237: done getting next task for host managed_node1
34350 1726853750.32241: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34350 1726853750.32243: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.32262: getting variables
34350 1726853750.32264: in VariableManager get_vars()
34350 1726853750.32310: Calling all_inventory to load vars for managed_node1
34350 1726853750.32313: Calling groups_inventory to load vars for managed_node1
34350 1726853750.32315: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.32324: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.32327: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.32330: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.32754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.33394: done with get_vars()
34350 1726853750.33404: done getting variables
34350 1726853750.33463: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:35:50 -0400 (0:00:00.032) 0:00:04.207 ******
34350 1726853750.33496: entering _queue_task() for managed_node1/service
34350 1726853750.33951: worker is 1 (out of 1 available)
34350 1726853750.33968: exiting _queue_task() for managed_node1/service
34350 1726853750.34480: done queuing things up, now waiting for results queue to drain
34350 1726853750.34482: waiting for pending results...
34350 1726853750.34789: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
34350 1726853750.34794: in run() - task 02083763-bbaf-b6c1-0de4-00000000006f
34350 1726853750.34797: variable 'ansible_search_path' from source: unknown
34350 1726853750.34799: variable 'ansible_search_path' from source: unknown
34350 1726853750.34802: calling self._execute()
34350 1726853750.35048: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.35059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.35077: variable 'omit' from source: magic vars
34350 1726853750.35796: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.35808: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.36138: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.36143: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.36146: when evaluation is False, skipping this task
34350 1726853750.36149: _execute() done
34350 1726853750.36153: dumping result to json
34350 1726853750.36156: done dumping result, returning
34350 1726853750.36164: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-b6c1-0de4-00000000006f]
34350 1726853750.36169: sending task result for task 02083763-bbaf-b6c1-0de4-00000000006f
34350 1726853750.36266: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000006f
34350 1726853750.36268: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34350 1726853750.36411: no more pending results, returning what we have
34350 1726853750.36415: results queue empty
34350 1726853750.36416: checking for any_errors_fatal
34350 1726853750.36423: done checking for any_errors_fatal
34350 1726853750.36424: checking for max_fail_percentage
34350 1726853750.36425: done checking for max_fail_percentage
34350 1726853750.36426: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.36427: done checking to see if all hosts have failed
34350 1726853750.36427: getting the remaining hosts for this loop
34350 1726853750.36430: done getting the remaining hosts for this loop
34350 1726853750.36433: getting the next task for host managed_node1
34350 1726853750.36441: done getting next task for host managed_node1
34350 1726853750.36444: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34350 1726853750.36447: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.36466: getting variables
34350 1726853750.36468: in VariableManager get_vars()
34350 1726853750.36515: Calling all_inventory to load vars for managed_node1
34350 1726853750.36518: Calling groups_inventory to load vars for managed_node1
34350 1726853750.36520: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.36531: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.36533: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.36536: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.36906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.37304: done with get_vars()
34350 1726853750.37314: done getting variables
34350 1726853750.37577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Friday 20 September 2024 13:35:50 -0400 (0:00:00.041) 0:00:04.248 ******
34350 1726853750.37610: entering _queue_task() for managed_node1/service
34350 1726853750.38072: worker is 1 (out of 1 available)
34350 1726853750.38085: exiting _queue_task() for managed_node1/service
34350 1726853750.38095: done queuing things up, now waiting for results queue to drain
34350 1726853750.38096: waiting for pending results...
34350 1726853750.38686: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
34350 1726853750.38802: in run() - task 02083763-bbaf-b6c1-0de4-000000000070
34350 1726853750.38815: variable 'ansible_search_path' from source: unknown
34350 1726853750.38819: variable 'ansible_search_path' from source: unknown
34350 1726853750.38924: calling self._execute()
34350 1726853750.39116: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.39121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.39136: variable 'omit' from source: magic vars
34350 1726853750.39953: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.39964: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.40178: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.40183: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.40187: when evaluation is False, skipping this task
34350 1726853750.40249: _execute() done
34350 1726853750.40253: dumping result to json
34350 1726853750.40257: done dumping result, returning
34350 1726853750.40262: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-b6c1-0de4-000000000070]
34350 1726853750.40268: sending task result for task 02083763-bbaf-b6c1-0de4-000000000070
34350 1726853750.40505: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000070
34350 1726853750.40508: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.40555: no more pending results, returning what we have
34350 1726853750.40562: results queue empty
34350 1726853750.40563: checking for any_errors_fatal
34350 1726853750.40570: done checking for any_errors_fatal
34350 1726853750.40572: checking for max_fail_percentage
34350 1726853750.40574: done checking for max_fail_percentage
34350 1726853750.40575: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.40576: done checking to see if all hosts have failed
34350 1726853750.40577: getting the remaining hosts for this loop
34350 1726853750.40579: done getting the remaining hosts for this loop
34350 1726853750.40582: getting the next task for host managed_node1
34350 1726853750.40590: done getting next task for host managed_node1
34350 1726853750.40594: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
34350 1726853750.40597: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.40615: getting variables
34350 1726853750.40617: in VariableManager get_vars()
34350 1726853750.40666: Calling all_inventory to load vars for managed_node1
34350 1726853750.40670: Calling groups_inventory to load vars for managed_node1
34350 1726853750.40876: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.40885: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.40888: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.40890: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.41303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.41703: done with get_vars()
34350 1726853750.41712: done getting variables
34350 1726853750.41776: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024 13:35:50 -0400 (0:00:00.041) 0:00:04.290 ******
34350 1726853750.41808: entering _queue_task() for managed_node1/service
34350 1726853750.42705: worker is 1 (out of 1 available)
34350 1726853750.42715: exiting _queue_task() for managed_node1/service
34350 1726853750.42725: done queuing things up, now waiting for results queue to drain
34350 1726853750.42726: waiting for pending results...
34350 1726853750.42965: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
34350 1726853750.43063: in run() - task 02083763-bbaf-b6c1-0de4-000000000071
34350 1726853750.43186: variable 'ansible_search_path' from source: unknown
34350 1726853750.43194: variable 'ansible_search_path' from source: unknown
34350 1726853750.43235: calling self._execute()
34350 1726853750.43495: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.43498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.43501: variable 'omit' from source: magic vars
34350 1726853750.44212: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.44228: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.44393: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.44537: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.44545: when evaluation is False, skipping this task
34350 1726853750.44552: _execute() done
34350 1726853750.44561: dumping result to json
34350 1726853750.44569: done dumping result, returning
34350 1726853750.44582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-b6c1-0de4-000000000071]
34350 1726853750.44591: sending task result for task 02083763-bbaf-b6c1-0de4-000000000071
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
34350 1726853750.44820: no more pending results, returning what we have
34350 1726853750.44824: results queue empty
34350 1726853750.44825: checking for any_errors_fatal
34350 1726853750.44832: done checking for any_errors_fatal
34350 1726853750.44832: checking for max_fail_percentage
34350 1726853750.44834: done checking for max_fail_percentage
34350 1726853750.44835: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.44836: done checking to see if all hosts have failed
34350 1726853750.44837: getting the remaining hosts for this loop
34350 1726853750.44838: done getting the remaining hosts for this loop
34350 1726853750.44842: getting the next task for host managed_node1
34350 1726853750.44850: done getting next task for host managed_node1
34350 1726853750.44854: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34350 1726853750.44860: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.44880: getting variables
34350 1726853750.44882: in VariableManager get_vars()
34350 1726853750.44932: Calling all_inventory to load vars for managed_node1
34350 1726853750.44935: Calling groups_inventory to load vars for managed_node1
34350 1726853750.44937: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.44949: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.44952: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.44955: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.45523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.46031: done with get_vars()
34350 1726853750.46043: done getting variables
34350 1726853750.46080: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000071
34350 1726853750.46083: WORKER PROCESS EXITING
34350 1726853750.46120: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024 13:35:50 -0400 (0:00:00.043) 0:00:04.334 ******
34350 1726853750.46155: entering _queue_task() for managed_node1/copy
34350 1726853750.46637: worker is 1 (out of 1 available)
34350 1726853750.46649: exiting _queue_task() for managed_node1/copy
34350 1726853750.46661: done queuing things up, now waiting for results queue to drain
34350 1726853750.46663: waiting for pending results...
34350 1726853750.47075: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
34350 1726853750.47392: in run() - task 02083763-bbaf-b6c1-0de4-000000000072
34350 1726853750.47406: variable 'ansible_search_path' from source: unknown
34350 1726853750.47410: variable 'ansible_search_path' from source: unknown
34350 1726853750.47449: calling self._execute()
34350 1726853750.47647: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.47654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.47665: variable 'omit' from source: magic vars
34350 1726853750.48373: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.48397: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.48527: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.48538: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.48545: when evaluation is False, skipping this task
34350 1726853750.48552: _execute() done
34350 1726853750.48562: dumping result to json
34350 1726853750.48617: done dumping result, returning
34350 1726853750.48622: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-b6c1-0de4-000000000072]
34350 1726853750.48625: sending task result for task 02083763-bbaf-b6c1-0de4-000000000072
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.48855: no more pending results, returning what we have
34350 1726853750.48863: results queue empty
34350 1726853750.48864: checking for any_errors_fatal
34350 1726853750.48869: done checking for any_errors_fatal
34350 1726853750.48872: checking for max_fail_percentage
34350 1726853750.48874: done checking for max_fail_percentage
34350 1726853750.48876: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.48877: done checking to see if all hosts have failed
34350 1726853750.48878: getting the remaining hosts for this loop
34350 1726853750.48879: done getting the remaining hosts for this loop
34350 1726853750.48883: getting the next task for host managed_node1
34350 1726853750.48891: done getting next task for host managed_node1
34350 1726853750.48895: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34350 1726853750.48898: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.48916: getting variables
34350 1726853750.48918: in VariableManager get_vars()
34350 1726853750.49091: Calling all_inventory to load vars for managed_node1
34350 1726853750.49094: Calling groups_inventory to load vars for managed_node1
34350 1726853750.49097: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.49167: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.49172: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.49176: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.49492: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000072
34350 1726853750.49495: WORKER PROCESS EXITING
34350 1726853750.49523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.49745: done with get_vars()
34350 1726853750.49754: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024 13:35:50 -0400 (0:00:00.036) 0:00:04.371 ******
34350 1726853750.49852: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34350 1726853750.50118: worker is 1 (out of 1 available)
34350 1726853750.50131: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
34350 1726853750.50254: done queuing things up, now waiting for results queue to drain
34350 1726853750.50256: waiting for pending results...
34350 1726853750.50433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
34350 1726853750.50556: in run() - task 02083763-bbaf-b6c1-0de4-000000000073
34350 1726853750.50609: variable 'ansible_search_path' from source: unknown
34350 1726853750.50615: variable 'ansible_search_path' from source: unknown
34350 1726853750.50649: calling self._execute()
34350 1726853750.50880: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.50883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.50885: variable 'omit' from source: magic vars
34350 1726853750.51547: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.51776: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.51840: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.52028: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.52032: when evaluation is False, skipping this task
34350 1726853750.52035: _execute() done
34350 1726853750.52037: dumping result to json
34350 1726853750.52039: done dumping result, returning
34350 1726853750.52042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-b6c1-0de4-000000000073]
34350 1726853750.52044: sending task result for task 02083763-bbaf-b6c1-0de4-000000000073
34350 1726853750.52118: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000073
34350 1726853750.52122: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.52179: no more pending results, returning what we have
34350 1726853750.52183: results queue empty
34350 1726853750.52184: checking for any_errors_fatal
34350 1726853750.52190: done checking for any_errors_fatal
34350 1726853750.52191: checking for max_fail_percentage
34350 1726853750.52193: done checking for max_fail_percentage
34350 1726853750.52194: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.52194: done checking to see if all hosts have failed
34350 1726853750.52195: getting the remaining hosts for this loop
34350 1726853750.52197: done getting the remaining hosts for this loop
34350 1726853750.52200: getting the next task for host managed_node1
34350 1726853750.52208: done getting next task for host managed_node1
34350 1726853750.52212: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
34350 1726853750.52215: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.52232: getting variables
34350 1726853750.52234: in VariableManager get_vars()
34350 1726853750.52287: Calling all_inventory to load vars for managed_node1
34350 1726853750.52290: Calling groups_inventory to load vars for managed_node1
34350 1726853750.52293: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.52304: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.52307: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.52311: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.52937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.53427: done with get_vars()
34350 1726853750.53437: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:35:50 -0400 (0:00:00.037) 0:00:04.409 ******
34350 1726853750.53634: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34350 1726853750.54115: worker is 1 (out of 1 available)
34350 1726853750.54242: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state
34350 1726853750.54253: done queuing things up, now waiting for results queue to drain
34350 1726853750.54254: waiting for pending results...
34350 1726853750.54896: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state
34350 1726853750.54901: in run() - task 02083763-bbaf-b6c1-0de4-000000000074
34350 1726853750.54904: variable 'ansible_search_path' from source: unknown
34350 1726853750.55096: variable 'ansible_search_path' from source: unknown
34350 1726853750.55102: calling self._execute()
34350 1726853750.55189: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.55204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.55219: variable 'omit' from source: magic vars
34350 1726853750.55589: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.55605: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.55721: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.55732: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.55744: when evaluation is False, skipping this task
34350 1726853750.55751: _execute() done
34350 1726853750.55758: dumping result to json
34350 1726853750.55765: done dumping result, returning
34350 1726853750.55777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-b6c1-0de4-000000000074]
34350 1726853750.55786: sending task result for task 02083763-bbaf-b6c1-0de4-000000000074
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.55928: no more pending results, returning what we have
34350 1726853750.55932: results queue empty
34350 1726853750.55932: checking for any_errors_fatal
34350 1726853750.55939: done checking for any_errors_fatal
34350 1726853750.55940: checking for max_fail_percentage
34350 1726853750.55942: done checking for max_fail_percentage
34350 1726853750.55943: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.55944: done checking to see if all hosts have failed
34350 1726853750.55944: getting the remaining hosts for this loop
34350 1726853750.55946: done getting the remaining hosts for this loop
34350 1726853750.55949: getting the next task for host managed_node1
34350 1726853750.55956: done getting next task for host managed_node1
34350 1726853750.55962: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34350 1726853750.55965: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.55992: getting variables
34350 1726853750.55994: in VariableManager get_vars()
34350 1726853750.56039: Calling all_inventory to load vars for managed_node1
34350 1726853750.56042: Calling groups_inventory to load vars for managed_node1
34350 1726853750.56044: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.56057: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.56062: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.56066: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.56586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.56994: done with get_vars()
34350 1726853750.57004: done getting variables
34350 1726853750.57051: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000074
34350 1726853750.57055: WORKER PROCESS EXITING
34350 1726853750.57289: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:35:50 -0400 (0:00:00.036) 0:00:04.445 ******
34350 1726853750.57318: entering _queue_task() for managed_node1/debug
34350 1726853750.57755: worker is 1 (out of 1 available)
34350 1726853750.57769: exiting _queue_task() for managed_node1/debug
34350 1726853750.57783: done queuing things up, now waiting for results queue to drain
34350 1726853750.57784: waiting for pending results...
34350 1726853750.58173: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
34350 1726853750.58422: in run() - task 02083763-bbaf-b6c1-0de4-000000000075
34350 1726853750.58474: variable 'ansible_search_path' from source: unknown
34350 1726853750.58577: variable 'ansible_search_path' from source: unknown
34350 1726853750.58613: calling self._execute()
34350 1726853750.58978: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.58981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.58984: variable 'omit' from source: magic vars
34350 1726853750.59500: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.59640: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.59869: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.59883: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.59891: when evaluation is False, skipping this task
34350 1726853750.59898: _execute() done
34350 1726853750.59905: dumping result to json
34350 1726853750.59914: done dumping result, returning
34350 1726853750.59926: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-b6c1-0de4-000000000075]
34350 1726853750.59958: sending task result for task 02083763-bbaf-b6c1-0de4-000000000075
34350 1726853750.60238: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000075
34350 1726853750.60241: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853750.60318: no more pending results, returning what we have
34350 1726853750.60322: results queue empty
34350 1726853750.60323: checking for any_errors_fatal
34350 1726853750.60330: done checking for any_errors_fatal
34350 1726853750.60331: checking for max_fail_percentage
34350 1726853750.60332: done checking for max_fail_percentage
34350 1726853750.60333: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.60334: done checking to see if all hosts have failed
34350 1726853750.60335: getting the remaining hosts for this loop
34350 1726853750.60336: done getting the remaining hosts for this loop
34350 1726853750.60339: getting the next task for host managed_node1
34350 1726853750.60347: done getting next task for host managed_node1
34350 1726853750.60351: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
34350 1726853750.60354: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.60374: getting variables
34350 1726853750.60376: in VariableManager get_vars()
34350 1726853750.60423: Calling all_inventory to load vars for managed_node1
34350 1726853750.60426: Calling groups_inventory to load vars for managed_node1
34350 1726853750.60428: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.60438: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.60440: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.60442: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.60642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.61148: done with get_vars()
34350 1726853750.61162: done getting variables
34350 1726853750.61347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Friday 20 September 2024 13:35:50 -0400 (0:00:00.040) 0:00:04.486 ******
34350 1726853750.61385: entering _queue_task() for managed_node1/debug
34350 1726853750.62124: worker is 1 (out of 1 available)
34350 1726853750.62139: exiting _queue_task() for managed_node1/debug
34350 1726853750.62150: done queuing things up, now waiting for results queue to drain
34350 1726853750.62151: waiting for pending results...
34350 1726853750.62619: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853750.62901: in run() - task 02083763-bbaf-b6c1-0de4-000000000076 34350 1726853750.62924: variable 'ansible_search_path' from source: unknown 34350 1726853750.62933: variable 'ansible_search_path' from source: unknown 34350 1726853750.63216: calling self._execute() 34350 1726853750.63235: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.63241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.63254: variable 'omit' from source: magic vars 34350 1726853750.64178: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.64183: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.64317: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.64329: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.64336: when evaluation is False, skipping this task 34350 1726853750.64343: _execute() done 34350 1726853750.64382: dumping result to json 34350 1726853750.64391: done dumping result, returning 34350 1726853750.64408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-b6c1-0de4-000000000076] 34350 1726853750.64418: sending task result for task 02083763-bbaf-b6c1-0de4-000000000076 34350 1726853750.64648: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000076 34350 1726853750.64652: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853750.64711: no more pending results, returning what we have 34350 1726853750.64714: results queue empty 34350 1726853750.64715: checking for any_errors_fatal 34350 1726853750.64722: done 
checking for any_errors_fatal 34350 1726853750.64722: checking for max_fail_percentage 34350 1726853750.64724: done checking for max_fail_percentage 34350 1726853750.64725: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.64726: done checking to see if all hosts have failed 34350 1726853750.64727: getting the remaining hosts for this loop 34350 1726853750.64728: done getting the remaining hosts for this loop 34350 1726853750.64732: getting the next task for host managed_node1 34350 1726853750.64743: done getting next task for host managed_node1 34350 1726853750.64747: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853750.64749: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
34350 1726853750.64769: getting variables
34350 1726853750.64773: in VariableManager get_vars()
34350 1726853750.64823: Calling all_inventory to load vars for managed_node1
34350 1726853750.64826: Calling groups_inventory to load vars for managed_node1
34350 1726853750.64828: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.64840: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.64842: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.64844: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.65414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.65878: done with get_vars()
34350 1726853750.65894: done getting variables
34350 1726853750.66021: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:35:50 -0400 (0:00:00.046) 0:00:04.533 ******
34350 1726853750.66057: entering _queue_task() for managed_node1/debug
34350 1726853750.66804: worker is 1 (out of 1 available)
34350 1726853750.66818: exiting _queue_task() for managed_node1/debug
34350 1726853750.66829: done queuing things up, now waiting for results queue to drain
34350 1726853750.66830: waiting for pending results...
34350 1726853750.67264: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
34350 1726853750.68077: in run() - task 02083763-bbaf-b6c1-0de4-000000000077
34350 1726853750.68081: variable 'ansible_search_path' from source: unknown
34350 1726853750.68084: variable 'ansible_search_path' from source: unknown
34350 1726853750.68086: calling self._execute()
34350 1726853750.68089: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.68476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.68480: variable 'omit' from source: magic vars
34350 1726853750.69237: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.69376: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.69379: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.69382: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.69577: when evaluation is False, skipping this task
34350 1726853750.69581: _execute() done
34350 1726853750.69584: dumping result to json
34350 1726853750.69587: done dumping result, returning
34350 1726853750.69778: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-b6c1-0de4-000000000077]
34350 1726853750.69783: sending task result for task 02083763-bbaf-b6c1-0de4-000000000077
34350 1726853750.69849: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000077
34350 1726853750.69852: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853750.70096: no more pending results, returning what we have
34350 1726853750.70102: results queue empty
34350 1726853750.70103: checking for any_errors_fatal
34350 1726853750.70108: done checking for any_errors_fatal
34350 1726853750.70109: checking for max_fail_percentage
34350 1726853750.70110: done checking for max_fail_percentage
34350 1726853750.70111: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.70112: done checking to see if all hosts have failed
34350 1726853750.70112: getting the remaining hosts for this loop
34350 1726853750.70114: done getting the remaining hosts for this loop
34350 1726853750.70118: getting the next task for host managed_node1
34350 1726853750.70124: done getting next task for host managed_node1
34350 1726853750.70128: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
34350 1726853750.70131: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.70147: getting variables
34350 1726853750.70149: in VariableManager get_vars()
34350 1726853750.70202: Calling all_inventory to load vars for managed_node1
34350 1726853750.70205: Calling groups_inventory to load vars for managed_node1
34350 1726853750.70207: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.70219: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.70222: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.70225: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.70830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.71251: done with get_vars()
34350 1726853750.71262: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:35:50 -0400 (0:00:00.053) 0:00:04.587 ******
34350 1726853750.71435: entering _queue_task() for managed_node1/ping
34350 1726853750.72061: worker is 1 (out of 1 available)
34350 1726853750.72276: exiting _queue_task() for managed_node1/ping
34350 1726853750.72286: done queuing things up, now waiting for results queue to drain
34350 1726853750.72288: waiting for pending results...
34350 1726853750.72660: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
34350 1726853750.72897: in run() - task 02083763-bbaf-b6c1-0de4-000000000078
34350 1726853750.72918: variable 'ansible_search_path' from source: unknown
34350 1726853750.73327: variable 'ansible_search_path' from source: unknown
34350 1726853750.73331: calling self._execute()
34350 1726853750.73440: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.73454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.73473: variable 'omit' from source: magic vars
34350 1726853750.74670: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.74691: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.74933: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.75057: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.75076: when evaluation is False, skipping this task
34350 1726853750.75086: _execute() done
34350 1726853750.75288: dumping result to json
34350 1726853750.75291: done dumping result, returning
34350 1726853750.75294: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-b6c1-0de4-000000000078]
34350 1726853750.75296: sending task result for task 02083763-bbaf-b6c1-0de4-000000000078
34350 1726853750.75363: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000078
34350 1726853750.75367: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.75440: no more pending results, returning what we have
34350 1726853750.75445: results queue empty
34350 1726853750.75446: checking for any_errors_fatal
34350 1726853750.75454: done checking for any_errors_fatal
34350 1726853750.75455: checking for max_fail_percentage
34350 1726853750.75457: done checking for max_fail_percentage
34350 1726853750.75458: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.75459: done checking to see if all hosts have failed
34350 1726853750.75459: getting the remaining hosts for this loop
34350 1726853750.75461: done getting the remaining hosts for this loop
34350 1726853750.75465: getting the next task for host managed_node1
34350 1726853750.75477: done getting next task for host managed_node1
34350 1726853750.75480: ^ task is: TASK: meta (role_complete)
34350 1726853750.75483: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
34350 1726853750.75502: getting variables
34350 1726853750.75504: in VariableManager get_vars()
34350 1726853750.75557: Calling all_inventory to load vars for managed_node1
34350 1726853750.75561: Calling groups_inventory to load vars for managed_node1
34350 1726853750.75564: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.75882: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.75886: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.75890: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.76131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.76345: done with get_vars()
34350 1726853750.76355: done getting variables
34350 1726853750.76436: done queuing things up, now waiting for results queue to drain
34350 1726853750.76438: results queue empty
34350 1726853750.76439: checking for any_errors_fatal
34350 1726853750.76445: done checking for any_errors_fatal
34350 1726853750.76446: checking for max_fail_percentage
34350 1726853750.76447: done checking for max_fail_percentage
34350 1726853750.76448: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.76449: done checking to see if all hosts have failed
34350 1726853750.76449: getting the remaining hosts for this loop
34350 1726853750.76450: done getting the remaining hosts for this loop
34350 1726853750.76452: getting the next task for host managed_node1
34350 1726853750.76456: done getting next task for host managed_node1
34350 1726853750.76457: ^ task is: TASK: TEST: wireless connection with 802.1x TLS-EAP
34350 1726853750.76459: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.76461: getting variables
34350 1726853750.76462: in VariableManager get_vars()
34350 1726853750.76484: Calling all_inventory to load vars for managed_node1
34350 1726853750.76486: Calling groups_inventory to load vars for managed_node1
34350 1726853750.76487: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.76491: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.76493: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.76495: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.76607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.76804: done with get_vars()
34350 1726853750.76813: done getting variables
34350 1726853750.76855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST: wireless connection with 802.1x TLS-EAP] ***************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:53
Friday 20 September 2024 13:35:50 -0400 (0:00:00.055) 0:00:04.642 ******
34350 1726853750.76949: entering _queue_task() for managed_node1/debug
34350 1726853750.77598: worker is 1 (out of 1 available)
34350 1726853750.77607: exiting _queue_task() for managed_node1/debug
34350 1726853750.77616: done queuing things up, now waiting for results queue to drain
34350 1726853750.77617: waiting for pending results...
34350 1726853750.77699: running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP
34350 1726853750.77846: in run() - task 02083763-bbaf-b6c1-0de4-0000000000a8
34350 1726853750.77850: variable 'ansible_search_path' from source: unknown
34350 1726853750.77853: calling self._execute()
34350 1726853750.77953: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.77966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.77982: variable 'omit' from source: magic vars
34350 1726853750.78350: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.78388: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.78549: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.78561: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.78570: when evaluation is False, skipping this task
34350 1726853750.78605: _execute() done
34350 1726853750.78621: dumping result to json
34350 1726853750.78629: done dumping result, returning
34350 1726853750.78715: done running TaskExecutor() for managed_node1/TASK: TEST: wireless connection with 802.1x TLS-EAP [02083763-bbaf-b6c1-0de4-0000000000a8]
34350 1726853750.78718: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000a8
34350 1726853750.78789: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000a8
34350 1726853750.78792: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "ansible_distribution_major_version == '7'"
}
34350 1726853750.78863: no more pending results, returning what we have
34350 1726853750.78868: results queue empty
34350 1726853750.78869: checking for any_errors_fatal
34350 1726853750.78873: done checking for any_errors_fatal
34350 1726853750.78874: checking for max_fail_percentage
34350 1726853750.78876: done checking for max_fail_percentage
34350 1726853750.78877: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.78878: done checking to see if all hosts have failed
34350 1726853750.78878: getting the remaining hosts for this loop
34350 1726853750.78880: done getting the remaining hosts for this loop
34350 1726853750.78884: getting the next task for host managed_node1
34350 1726853750.78897: done getting next task for host managed_node1
34350 1726853750.78908: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853750.78913: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
34350 1726853750.78934: getting variables
34350 1726853750.78936: in VariableManager get_vars()
34350 1726853750.78992: Calling all_inventory to load vars for managed_node1
34350 1726853750.78995: Calling groups_inventory to load vars for managed_node1
34350 1726853750.78998: Calling all_plugins_inventory to load vars for managed_node1
34350 1726853750.79186: Calling all_plugins_play to load vars for managed_node1
34350 1726853750.79191: Calling groups_plugins_inventory to load vars for managed_node1
34350 1726853750.79194: Calling groups_plugins_play to load vars for managed_node1
34350 1726853750.79670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
34350 1726853750.79881: done with get_vars()
34350 1726853750.79890: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 13:35:50 -0400 (0:00:00.030) 0:00:04.672 ******
34350 1726853750.79988: entering _queue_task() for managed_node1/include_tasks
34350 1726853750.80247: worker is 1 (out of 1 available)
34350 1726853750.80263: exiting _queue_task() for managed_node1/include_tasks
34350 1726853750.80389: done queuing things up, now waiting for results queue to drain
34350 1726853750.80391: waiting for pending results...
34350 1726853750.80688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
34350 1726853750.80701: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b0
34350 1726853750.80727: variable 'ansible_search_path' from source: unknown
34350 1726853750.80787: variable 'ansible_search_path' from source: unknown
34350 1726853750.80790: calling self._execute()
34350 1726853750.80870: variable 'ansible_host' from source: host vars for 'managed_node1'
34350 1726853750.80885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
34350 1726853750.80904: variable 'omit' from source: magic vars
34350 1726853750.81668: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.81814: Evaluated conditional (ansible_distribution_major_version != '6'): True
34350 1726853750.82136: variable 'ansible_distribution_major_version' from source: facts
34350 1726853750.82139: Evaluated conditional (ansible_distribution_major_version == '7'): False
34350 1726853750.82142: when evaluation is False, skipping this task
34350 1726853750.82144: _execute() done
34350 1726853750.82146: dumping result to json
34350 1726853750.82148: done dumping result, returning
34350 1726853750.82150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-b6c1-0de4-0000000000b0]
34350 1726853750.82152: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b0
34350 1726853750.82224: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b0
34350 1726853750.82227: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}
34350 1726853750.82281: no more pending results, returning what we have
34350 1726853750.82284: results queue empty
34350 1726853750.82285: checking for any_errors_fatal
34350 1726853750.82294: done checking for any_errors_fatal
34350 1726853750.82295: checking for max_fail_percentage
34350 1726853750.82296: done checking for max_fail_percentage
34350 1726853750.82297: checking to see if all hosts have failed and the running result is not ok
34350 1726853750.82298: done checking to see if all hosts have failed
34350 1726853750.82299: getting the remaining hosts for this loop
34350 1726853750.82300: done getting the remaining hosts for this loop
34350 1726853750.82303: getting the next task for host managed_node1
34350 1726853750.82311: done getting next task for host managed_node1
34350 1726853750.82315: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
34350 1726853750.82317: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 34350 1726853750.82335: getting variables 34350 1726853750.82336: in VariableManager get_vars() 34350 1726853750.82382: Calling all_inventory to load vars for managed_node1 34350 1726853750.82385: Calling groups_inventory to load vars for managed_node1 34350 1726853750.82387: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.82401: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.82403: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.82406: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.83082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.83512: done with get_vars() 34350 1726853750.83522: done getting variables 34350 1726853750.83683: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:50 -0400 (0:00:00.037) 0:00:04.709 ****** 34350 1726853750.83714: entering _queue_task() for managed_node1/debug 34350 1726853750.84476: worker is 1 (out of 1 available) 34350 1726853750.84602: exiting _queue_task() for managed_node1/debug 34350 1726853750.84612: done queuing things up, now waiting for results queue to drain 34350 1726853750.84614: waiting for pending results... 
34350 1726853750.85216: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34350 1726853750.85302: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b1 34350 1726853750.85422: variable 'ansible_search_path' from source: unknown 34350 1726853750.85426: variable 'ansible_search_path' from source: unknown 34350 1726853750.85458: calling self._execute() 34350 1726853750.85703: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.85751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.85755: variable 'omit' from source: magic vars 34350 1726853750.86532: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.86585: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.86849: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.86947: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.86950: when evaluation is False, skipping this task 34350 1726853750.86953: _execute() done 34350 1726853750.86956: dumping result to json 34350 1726853750.86958: done dumping result, returning 34350 1726853750.86960: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-b6c1-0de4-0000000000b1] 34350 1726853750.86962: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b1 34350 1726853750.87307: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b1 34350 1726853750.87311: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853750.87360: no more pending results, returning what we have 34350 1726853750.87365: results queue empty 34350 1726853750.87365: checking for any_errors_fatal 34350 1726853750.87379: done checking for any_errors_fatal 34350 1726853750.87381: 
checking for max_fail_percentage 34350 1726853750.87382: done checking for max_fail_percentage 34350 1726853750.87383: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.87384: done checking to see if all hosts have failed 34350 1726853750.87384: getting the remaining hosts for this loop 34350 1726853750.87386: done getting the remaining hosts for this loop 34350 1726853750.87389: getting the next task for host managed_node1 34350 1726853750.87395: done getting next task for host managed_node1 34350 1726853750.87399: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34350 1726853750.87403: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.87421: getting variables 34350 1726853750.87422: in VariableManager get_vars() 34350 1726853750.87675: Calling all_inventory to load vars for managed_node1 34350 1726853750.87678: Calling groups_inventory to load vars for managed_node1 34350 1726853750.87681: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.87690: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.87693: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.87697: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.88297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.88854: done with get_vars() 34350 1726853750.88867: done getting variables 34350 1726853750.89044: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:50 -0400 (0:00:00.054) 0:00:04.764 ****** 34350 1726853750.89128: entering _queue_task() for managed_node1/fail 34350 1726853750.89654: worker is 1 (out of 1 available) 34350 1726853750.89782: exiting _queue_task() for managed_node1/fail 34350 1726853750.89792: done queuing things up, now waiting for results queue to drain 34350 1726853750.89793: waiting for pending results... 
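The skip pattern recorded above repeats for every task in this section: both `when` clauses are evaluated in order (`!= '6'` is True, `== '7'` is False), the task is skipped, and the failing clause is reported as `false_condition`. A minimal sketch of a task that would produce this trace — the play and `debug` message are illustrative reconstructions, not the role's actual tasks file; only the two conditions are taken from the log:

```yaml
# Hypothetical play reproducing the skip pattern seen in this log.
# With gathered facts reporting a major version other than '7', the
# second clause evaluates False and the task is skipped, emitting
# "false_condition": "ansible_distribution_major_version == '7'".
- hosts: managed_node1
  gather_facts: true
  tasks:
    - name: Print network provider            # illustrative task body
      ansible.builtin.debug:
        msg: "network provider would be printed here"
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version == '7'
```

A multi-item `when` list is an implicit AND, which is why the log shows two separate "Evaluated conditional" entries per task before the skip decision.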
34350 1726853750.90365: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34350 1726853750.90573: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b2 34350 1726853750.90577: variable 'ansible_search_path' from source: unknown 34350 1726853750.90580: variable 'ansible_search_path' from source: unknown 34350 1726853750.90582: calling self._execute() 34350 1726853750.90726: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.90795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.90810: variable 'omit' from source: magic vars 34350 1726853750.91527: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.91591: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.91979: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.91983: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.91985: when evaluation is False, skipping this task 34350 1726853750.91987: _execute() done 34350 1726853750.91989: dumping result to json 34350 1726853750.91992: done dumping result, returning 34350 1726853750.91994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-b6c1-0de4-0000000000b2] 34350 1726853750.91996: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b2 34350 1726853750.92068: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.92120: no more pending results, 
returning what we have 34350 1726853750.92124: results queue empty 34350 1726853750.92125: checking for any_errors_fatal 34350 1726853750.92133: done checking for any_errors_fatal 34350 1726853750.92134: checking for max_fail_percentage 34350 1726853750.92136: done checking for max_fail_percentage 34350 1726853750.92136: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.92137: done checking to see if all hosts have failed 34350 1726853750.92138: getting the remaining hosts for this loop 34350 1726853750.92139: done getting the remaining hosts for this loop 34350 1726853750.92167: getting the next task for host managed_node1 34350 1726853750.92177: done getting next task for host managed_node1 34350 1726853750.92181: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34350 1726853750.92185: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.92203: getting variables 34350 1726853750.92205: in VariableManager get_vars() 34350 1726853750.92250: Calling all_inventory to load vars for managed_node1 34350 1726853750.92253: Calling groups_inventory to load vars for managed_node1 34350 1726853750.92255: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.92482: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.92487: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.92491: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.92561: WORKER PROCESS EXITING 34350 1726853750.92853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.93162: done with get_vars() 34350 1726853750.93395: done getting variables 34350 1726853750.93454: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:50 -0400 (0:00:00.043) 0:00:04.807 ****** 34350 1726853750.93488: entering _queue_task() for managed_node1/fail 34350 1726853750.93997: worker is 1 (out of 1 available) 34350 1726853750.94010: exiting _queue_task() for managed_node1/fail 34350 1726853750.94021: done queuing things up, now waiting for results queue to drain 34350 1726853750.94022: waiting for pending results... 
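The task banners in this log carry a per-task duration (for example `(0:00:00.043)`) and a running total (`0:00:04.807`). Output in that shape is characteristic of a task-timing callback such as `profile_tasks`; assuming that is what is enabled for this run, the setup would look like the fragment below. Note this is an assumption: the run header reports "No config file found; using defaults", so if such a callback is active it is more likely enabled through the environment than through `ansible.cfg`.

```ini
; ansible.cfg — assumed way to enable per-task timing banners.
; Equivalent environment form: ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks
[defaults]
callbacks_enabled = ansible.posix.profile_tasks
```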
34350 1726853750.94436: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34350 1726853750.94939: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b3 34350 1726853750.94963: variable 'ansible_search_path' from source: unknown 34350 1726853750.95262: variable 'ansible_search_path' from source: unknown 34350 1726853750.95266: calling self._execute() 34350 1726853750.95403: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853750.95488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853750.95555: variable 'omit' from source: magic vars 34350 1726853750.96316: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.96562: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853750.96690: variable 'ansible_distribution_major_version' from source: facts 34350 1726853750.96701: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853750.96709: when evaluation is False, skipping this task 34350 1726853750.96717: _execute() done 34350 1726853750.96724: dumping result to json 34350 1726853750.96731: done dumping result, returning 34350 1726853750.96742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-b6c1-0de4-0000000000b3] 34350 1726853750.96750: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b3 34350 1726853750.97002: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b3 34350 1726853750.97005: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853750.97055: no more 
pending results, returning what we have 34350 1726853750.97062: results queue empty 34350 1726853750.97063: checking for any_errors_fatal 34350 1726853750.97069: done checking for any_errors_fatal 34350 1726853750.97073: checking for max_fail_percentage 34350 1726853750.97075: done checking for max_fail_percentage 34350 1726853750.97075: checking to see if all hosts have failed and the running result is not ok 34350 1726853750.97076: done checking to see if all hosts have failed 34350 1726853750.97077: getting the remaining hosts for this loop 34350 1726853750.97078: done getting the remaining hosts for this loop 34350 1726853750.97082: getting the next task for host managed_node1 34350 1726853750.97090: done getting next task for host managed_node1 34350 1726853750.97094: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34350 1726853750.97181: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853750.97219: getting variables 34350 1726853750.97222: in VariableManager get_vars() 34350 1726853750.97537: Calling all_inventory to load vars for managed_node1 34350 1726853750.97540: Calling groups_inventory to load vars for managed_node1 34350 1726853750.97542: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853750.97551: Calling all_plugins_play to load vars for managed_node1 34350 1726853750.97554: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853750.97560: Calling groups_plugins_play to load vars for managed_node1 34350 1726853750.98033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853750.98479: done with get_vars() 34350 1726853750.98490: done getting variables 34350 1726853750.98624: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:50 -0400 (0:00:00.051) 0:00:04.859 ****** 34350 1726853750.98663: entering _queue_task() for managed_node1/fail 34350 1726853750.99340: worker is 1 (out of 1 available) 34350 1726853750.99352: exiting _queue_task() for managed_node1/fail 34350 1726853750.99365: done queuing things up, now waiting for results queue to drain 34350 1726853750.99367: waiting for pending results... 
34350 1726853750.99988: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34350 1726853751.00207: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b4 34350 1726853751.00230: variable 'ansible_search_path' from source: unknown 34350 1726853751.00331: variable 'ansible_search_path' from source: unknown 34350 1726853751.00479: calling self._execute() 34350 1726853751.00678: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.00691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.00707: variable 'omit' from source: magic vars 34350 1726853751.01639: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.01642: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.01739: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.01857: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.01863: when evaluation is False, skipping this task 34350 1726853751.01866: _execute() done 34350 1726853751.01869: dumping result to json 34350 1726853751.01873: done dumping result, returning 34350 1726853751.01876: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-b6c1-0de4-0000000000b4] 34350 1726853751.01879: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b4 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.02228: no more pending results, returning what we have 34350 1726853751.02233: results queue empty 34350 1726853751.02234: checking for any_errors_fatal 34350 
1726853751.02242: done checking for any_errors_fatal 34350 1726853751.02243: checking for max_fail_percentage 34350 1726853751.02246: done checking for max_fail_percentage 34350 1726853751.02246: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.02247: done checking to see if all hosts have failed 34350 1726853751.02248: getting the remaining hosts for this loop 34350 1726853751.02249: done getting the remaining hosts for this loop 34350 1726853751.02253: getting the next task for host managed_node1 34350 1726853751.02264: done getting next task for host managed_node1 34350 1726853751.02268: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34350 1726853751.02273: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.02293: getting variables 34350 1726853751.02295: in VariableManager get_vars() 34350 1726853751.02346: Calling all_inventory to load vars for managed_node1 34350 1726853751.02350: Calling groups_inventory to load vars for managed_node1 34350 1726853751.02352: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.02367: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.02673: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.02680: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.03032: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b4 34350 1726853751.03036: WORKER PROCESS EXITING 34350 1726853751.03062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.03502: done with get_vars() 34350 1726853751.03513: done getting variables 34350 1726853751.03677: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:51 -0400 (0:00:00.050) 0:00:04.909 ****** 34350 1726853751.03713: entering _queue_task() for managed_node1/dnf 34350 1726853751.04407: worker is 1 (out of 1 available) 34350 1726853751.04420: exiting _queue_task() for managed_node1/dnf 34350 1726853751.04431: done queuing things up, now waiting for results queue to drain 34350 1726853751.04433: waiting for pending results... 
34350 1726853751.04934: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34350 1726853751.05267: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b5 34350 1726853751.05290: variable 'ansible_search_path' from source: unknown 34350 1726853751.05300: variable 'ansible_search_path' from source: unknown 34350 1726853751.05349: calling self._execute() 34350 1726853751.05521: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.05762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.05767: variable 'omit' from source: magic vars 34350 1726853751.06631: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.06636: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.06781: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.06793: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.06801: when evaluation is False, skipping this task 34350 1726853751.06808: _execute() done 34350 1726853751.06815: dumping result to json 34350 1726853751.06855: done dumping result, returning 34350 1726853751.06873: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-0000000000b5] 34350 1726853751.06884: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.07114: no more pending results, returning what we have 34350 1726853751.07118: results queue empty 34350 
1726853751.07119: checking for any_errors_fatal 34350 1726853751.07126: done checking for any_errors_fatal 34350 1726853751.07127: checking for max_fail_percentage 34350 1726853751.07129: done checking for max_fail_percentage 34350 1726853751.07130: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.07131: done checking to see if all hosts have failed 34350 1726853751.07132: getting the remaining hosts for this loop 34350 1726853751.07133: done getting the remaining hosts for this loop 34350 1726853751.07137: getting the next task for host managed_node1 34350 1726853751.07145: done getting next task for host managed_node1 34350 1726853751.07149: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34350 1726853751.07152: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.07185: getting variables 34350 1726853751.07188: in VariableManager get_vars() 34350 1726853751.07241: Calling all_inventory to load vars for managed_node1 34350 1726853751.07244: Calling groups_inventory to load vars for managed_node1 34350 1726853751.07247: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.07262: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.07266: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.07269: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.08076: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b5 34350 1726853751.08080: WORKER PROCESS EXITING 34350 1726853751.08251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.08664: done with get_vars() 34350 1726853751.08785: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853751.08863: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:51 -0400 (0:00:00.052) 0:00:04.962 ****** 34350 1726853751.08997: entering _queue_task() for managed_node1/yum 34350 1726853751.09596: worker is 1 (out of 1 available) 34350 1726853751.09609: exiting _queue_task() for managed_node1/yum 34350 1726853751.09674: done queuing things up, now 
waiting for results queue to drain 34350 1726853751.09676: waiting for pending results... 34350 1726853751.10183: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34350 1726853751.10328: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b6 34350 1726853751.10348: variable 'ansible_search_path' from source: unknown 34350 1726853751.10393: variable 'ansible_search_path' from source: unknown 34350 1726853751.10433: calling self._execute() 34350 1726853751.10677: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.10682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.10686: variable 'omit' from source: magic vars 34350 1726853751.11426: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.11444: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.11665: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.11694: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.11876: when evaluation is False, skipping this task 34350 1726853751.11880: _execute() done 34350 1726853751.11882: dumping result to json 34350 1726853751.11885: done dumping result, returning 34350 1726853751.11887: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-0000000000b6] 34350 1726853751.11890: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b6 34350 1726853751.11965: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b6 34350 1726853751.11970: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, 
"false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.12025: no more pending results, returning what we have 34350 1726853751.12030: results queue empty 34350 1726853751.12031: checking for any_errors_fatal 34350 1726853751.12037: done checking for any_errors_fatal 34350 1726853751.12038: checking for max_fail_percentage 34350 1726853751.12040: done checking for max_fail_percentage 34350 1726853751.12041: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.12042: done checking to see if all hosts have failed 34350 1726853751.12042: getting the remaining hosts for this loop 34350 1726853751.12044: done getting the remaining hosts for this loop 34350 1726853751.12048: getting the next task for host managed_node1 34350 1726853751.12056: done getting next task for host managed_node1 34350 1726853751.12063: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853751.12066: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.12087: getting variables 34350 1726853751.12089: in VariableManager get_vars() 34350 1726853751.12135: Calling all_inventory to load vars for managed_node1 34350 1726853751.12138: Calling groups_inventory to load vars for managed_node1 34350 1726853751.12139: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.12151: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.12154: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.12156: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.12627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.13005: done with get_vars() 34350 1726853751.13016: done getting variables 34350 1726853751.13192: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:51 -0400 (0:00:00.042) 0:00:05.004 ****** 34350 1726853751.13223: entering _queue_task() for managed_node1/fail 34350 1726853751.13841: worker is 1 (out of 1 available) 34350 1726853751.13852: exiting _queue_task() for managed_node1/fail 34350 1726853751.13865: done queuing things up, now waiting for results queue to drain 34350 1726853751.13867: waiting for pending results... 
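The entry "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" a few entries above shows collection routing at work: on DNF-based hosts, a task written against the legacy `yum` module name resolves to the `dnf` action plugin, which is why the log then loads `plugins/action/dnf.py` for a task queued as `managed_node1/yum`. A hedged sketch of a task that would trigger this redirection — the package name and `check_mode` usage are illustrative, not taken from the role:

```yaml
# Illustrative task: written as yum, executed as dnf on this host.
# check_mode with state: latest is a common way to probe for pending
# updates without applying them (assumed here, not from the log).
- name: Check if updates for network packages are available
  ansible.builtin.yum:          # routed to ansible.builtin.dnf
    name: NetworkManager        # hypothetical package
    state: latest
  check_mode: true
```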
34350 1726853751.14490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853751.14495: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b7 34350 1726853751.14498: variable 'ansible_search_path' from source: unknown 34350 1726853751.14500: variable 'ansible_search_path' from source: unknown 34350 1726853751.14616: calling self._execute() 34350 1726853751.14792: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.14811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.14827: variable 'omit' from source: magic vars 34350 1726853751.15539: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.15561: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.15687: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.15698: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.15711: when evaluation is False, skipping this task 34350 1726853751.15719: _execute() done 34350 1726853751.15725: dumping result to json 34350 1726853751.15732: done dumping result, returning 34350 1726853751.15778: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-0000000000b7] 34350 1726853751.15783: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b7 34350 1726853751.15995: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b7 34350 1726853751.15998: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.16116: no more pending results, returning what we have 
34350 1726853751.16118: results queue empty 34350 1726853751.16119: checking for any_errors_fatal 34350 1726853751.16123: done checking for any_errors_fatal 34350 1726853751.16124: checking for max_fail_percentage 34350 1726853751.16126: done checking for max_fail_percentage 34350 1726853751.16126: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.16127: done checking to see if all hosts have failed 34350 1726853751.16128: getting the remaining hosts for this loop 34350 1726853751.16129: done getting the remaining hosts for this loop 34350 1726853751.16132: getting the next task for host managed_node1 34350 1726853751.16138: done getting next task for host managed_node1 34350 1726853751.16141: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34350 1726853751.16144: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.16163: getting variables 34350 1726853751.16164: in VariableManager get_vars() 34350 1726853751.16213: Calling all_inventory to load vars for managed_node1 34350 1726853751.16216: Calling groups_inventory to load vars for managed_node1 34350 1726853751.16218: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.16227: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.16230: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.16233: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.16741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.17457: done with get_vars() 34350 1726853751.17473: done getting variables 34350 1726853751.17601: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:51 -0400 (0:00:00.044) 0:00:05.049 ****** 34350 1726853751.17637: entering _queue_task() for managed_node1/package 34350 1726853751.18401: worker is 1 (out of 1 available) 34350 1726853751.18414: exiting _queue_task() for managed_node1/package 34350 1726853751.18424: done queuing things up, now waiting for results queue to drain 34350 1726853751.18425: waiting for pending results... 
34350 1726853751.18805: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34350 1726853751.18924: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b8 34350 1726853751.19119: variable 'ansible_search_path' from source: unknown 34350 1726853751.19122: variable 'ansible_search_path' from source: unknown 34350 1726853751.19125: calling self._execute() 34350 1726853751.19279: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.19282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.19285: variable 'omit' from source: magic vars 34350 1726853751.19976: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.20052: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.20282: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.20372: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.20479: when evaluation is False, skipping this task 34350 1726853751.20482: _execute() done 34350 1726853751.20484: dumping result to json 34350 1726853751.20486: done dumping result, returning 34350 1726853751.20488: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-b6c1-0de4-0000000000b8] 34350 1726853751.20490: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b8 34350 1726853751.20562: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b8 34350 1726853751.20565: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.20620: no more pending results, returning what we have 34350 1726853751.20624: results queue empty 34350 1726853751.20625: checking for any_errors_fatal 34350 1726853751.20632: done 
checking for any_errors_fatal 34350 1726853751.20633: checking for max_fail_percentage 34350 1726853751.20636: done checking for max_fail_percentage 34350 1726853751.20637: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.20637: done checking to see if all hosts have failed 34350 1726853751.20638: getting the remaining hosts for this loop 34350 1726853751.20640: done getting the remaining hosts for this loop 34350 1726853751.20643: getting the next task for host managed_node1 34350 1726853751.20651: done getting next task for host managed_node1 34350 1726853751.20655: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853751.20660: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.20681: getting variables 34350 1726853751.20683: in VariableManager get_vars() 34350 1726853751.20729: Calling all_inventory to load vars for managed_node1 34350 1726853751.20731: Calling groups_inventory to load vars for managed_node1 34350 1726853751.20733: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.20744: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.20746: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.20749: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.21135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.21625: done with get_vars() 34350 1726853751.21635: done getting variables 34350 1726853751.21809: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:51 -0400 (0:00:00.042) 0:00:05.091 ****** 34350 1726853751.21842: entering _queue_task() for managed_node1/package 34350 1726853751.22488: worker is 1 (out of 1 available) 34350 1726853751.22501: exiting _queue_task() for managed_node1/package 34350 1726853751.22515: done queuing things up, now waiting for results queue to drain 34350 1726853751.22517: waiting for pending results... 
34350 1726853751.22850: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853751.23563: in run() - task 02083763-bbaf-b6c1-0de4-0000000000b9 34350 1726853751.23568: variable 'ansible_search_path' from source: unknown 34350 1726853751.23573: variable 'ansible_search_path' from source: unknown 34350 1726853751.23576: calling self._execute() 34350 1726853751.23579: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.23581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.23584: variable 'omit' from source: magic vars 34350 1726853751.24545: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.24562: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.24953: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.24956: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.24959: when evaluation is False, skipping this task 34350 1726853751.24962: _execute() done 34350 1726853751.24964: dumping result to json 34350 1726853751.24966: done dumping result, returning 34350 1726853751.24969: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-b6c1-0de4-0000000000b9] 34350 1726853751.24974: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b9 34350 1726853751.25055: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000b9 34350 1726853751.25060: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.25118: no more pending results, returning what we have 34350 1726853751.25122: 
results queue empty 34350 1726853751.25123: checking for any_errors_fatal 34350 1726853751.25130: done checking for any_errors_fatal 34350 1726853751.25131: checking for max_fail_percentage 34350 1726853751.25133: done checking for max_fail_percentage 34350 1726853751.25135: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.25136: done checking to see if all hosts have failed 34350 1726853751.25136: getting the remaining hosts for this loop 34350 1726853751.25138: done getting the remaining hosts for this loop 34350 1726853751.25142: getting the next task for host managed_node1 34350 1726853751.25153: done getting next task for host managed_node1 34350 1726853751.25272: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853751.25276: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.25301: getting variables 34350 1726853751.25303: in VariableManager get_vars() 34350 1726853751.25364: Calling all_inventory to load vars for managed_node1 34350 1726853751.25368: Calling groups_inventory to load vars for managed_node1 34350 1726853751.25576: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.25588: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.25591: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.25594: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.26198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.26844: done with get_vars() 34350 1726853751.26856: done getting variables 34350 1726853751.26917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:51 -0400 (0:00:00.053) 0:00:05.144 ****** 34350 1726853751.27355: entering _queue_task() for managed_node1/package 34350 1726853751.28390: worker is 1 (out of 1 available) 34350 1726853751.28404: exiting _queue_task() for managed_node1/package 34350 1726853751.28415: done queuing things up, now waiting for results queue to drain 34350 1726853751.28417: waiting for pending results... 
34350 1726853751.29291: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853751.29295: in run() - task 02083763-bbaf-b6c1-0de4-0000000000ba 34350 1726853751.29524: variable 'ansible_search_path' from source: unknown 34350 1726853751.29528: variable 'ansible_search_path' from source: unknown 34350 1726853751.29755: calling self._execute() 34350 1726853751.29882: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.29982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.30076: variable 'omit' from source: magic vars 34350 1726853751.30527: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.30548: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.30675: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.30686: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.30693: when evaluation is False, skipping this task 34350 1726853751.30718: _execute() done 34350 1726853751.30721: dumping result to json 34350 1726853751.30723: done dumping result, returning 34350 1726853751.30726: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-b6c1-0de4-0000000000ba] 34350 1726853751.30732: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000ba skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.30981: no more pending results, returning what we have 34350 1726853751.30985: results queue empty 34350 1726853751.30986: checking for any_errors_fatal 34350 1726853751.30996: done checking for any_errors_fatal 34350 1726853751.30996: 
checking for max_fail_percentage 34350 1726853751.30998: done checking for max_fail_percentage 34350 1726853751.30999: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.31000: done checking to see if all hosts have failed 34350 1726853751.31001: getting the remaining hosts for this loop 34350 1726853751.31002: done getting the remaining hosts for this loop 34350 1726853751.31006: getting the next task for host managed_node1 34350 1726853751.31014: done getting next task for host managed_node1 34350 1726853751.31018: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853751.31021: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.31050: getting variables 34350 1726853751.31052: in VariableManager get_vars() 34350 1726853751.31103: Calling all_inventory to load vars for managed_node1 34350 1726853751.31107: Calling groups_inventory to load vars for managed_node1 34350 1726853751.31109: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.31121: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.31124: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.31127: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.31495: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000ba 34350 1726853751.31499: WORKER PROCESS EXITING 34350 1726853751.31520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.31749: done with get_vars() 34350 1726853751.31844: done getting variables 34350 1726853751.32017: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:51 -0400 (0:00:00.046) 0:00:05.193 ****** 34350 1726853751.32053: entering _queue_task() for managed_node1/service 34350 1726853751.32969: worker is 1 (out of 1 available) 34350 1726853751.32987: exiting _queue_task() for managed_node1/service 34350 1726853751.33223: done queuing things up, now waiting for results queue to drain 34350 1726853751.33225: waiting for pending results... 
34350 1726853751.33768: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853751.33919: in run() - task 02083763-bbaf-b6c1-0de4-0000000000bb 34350 1726853751.33996: variable 'ansible_search_path' from source: unknown 34350 1726853751.34000: variable 'ansible_search_path' from source: unknown 34350 1726853751.34028: calling self._execute() 34350 1726853751.34137: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.34197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.34201: variable 'omit' from source: magic vars 34350 1726853751.34980: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.34984: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.35092: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.35104: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.35111: when evaluation is False, skipping this task 34350 1726853751.35118: _execute() done 34350 1726853751.35124: dumping result to json 34350 1726853751.35132: done dumping result, returning 34350 1726853751.35150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-0000000000bb] 34350 1726853751.35162: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bb skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.35386: no more pending results, returning what we have 34350 1726853751.35390: results queue empty 34350 1726853751.35391: checking for any_errors_fatal 34350 1726853751.35397: done checking for any_errors_fatal 34350 1726853751.35397: 
checking for max_fail_percentage 34350 1726853751.35399: done checking for max_fail_percentage 34350 1726853751.35400: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.35401: done checking to see if all hosts have failed 34350 1726853751.35402: getting the remaining hosts for this loop 34350 1726853751.35403: done getting the remaining hosts for this loop 34350 1726853751.35407: getting the next task for host managed_node1 34350 1726853751.35415: done getting next task for host managed_node1 34350 1726853751.35418: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34350 1726853751.35421: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.35559: getting variables 34350 1726853751.35562: in VariableManager get_vars() 34350 1726853751.35606: Calling all_inventory to load vars for managed_node1 34350 1726853751.35609: Calling groups_inventory to load vars for managed_node1 34350 1726853751.35611: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.35621: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.35623: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.35626: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.36278: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bb 34350 1726853751.36282: WORKER PROCESS EXITING 34350 1726853751.36695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.37379: done with get_vars() 34350 1726853751.37391: done getting variables 34350 1726853751.37444: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:51 -0400 (0:00:00.054) 0:00:05.247 ****** 34350 1726853751.37538: entering _queue_task() for managed_node1/service 34350 1726853751.38240: worker is 1 (out of 1 available) 34350 1726853751.38253: exiting _queue_task() for managed_node1/service 34350 1726853751.38265: done queuing things up, now waiting for results queue to drain 34350 1726853751.38266: waiting for pending results... 
34350 1726853751.38866: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34350 1726853751.39140: in run() - task 02083763-bbaf-b6c1-0de4-0000000000bc 34350 1726853751.39165: variable 'ansible_search_path' from source: unknown 34350 1726853751.39186: variable 'ansible_search_path' from source: unknown 34350 1726853751.39326: calling self._execute() 34350 1726853751.39579: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.39593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.39616: variable 'omit' from source: magic vars 34350 1726853751.40352: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.40438: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.40811: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.40814: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.40817: when evaluation is False, skipping this task 34350 1726853751.40820: _execute() done 34350 1726853751.40822: dumping result to json 34350 1726853751.40823: done dumping result, returning 34350 1726853751.40825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-b6c1-0de4-0000000000bc] 34350 1726853751.40827: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bc 34350 1726853751.41079: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bc 34350 1726853751.41083: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34350 1726853751.41133: no more pending results, returning what we have 34350 1726853751.41137: results queue empty 34350 1726853751.41138: checking for any_errors_fatal 
34350 1726853751.41146: done checking for any_errors_fatal 34350 1726853751.41147: checking for max_fail_percentage 34350 1726853751.41149: done checking for max_fail_percentage 34350 1726853751.41150: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.41151: done checking to see if all hosts have failed 34350 1726853751.41152: getting the remaining hosts for this loop 34350 1726853751.41153: done getting the remaining hosts for this loop 34350 1726853751.41157: getting the next task for host managed_node1 34350 1726853751.41165: done getting next task for host managed_node1 34350 1726853751.41170: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34350 1726853751.41174: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.41195: getting variables 34350 1726853751.41197: in VariableManager get_vars() 34350 1726853751.41243: Calling all_inventory to load vars for managed_node1 34350 1726853751.41246: Calling groups_inventory to load vars for managed_node1 34350 1726853751.41249: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.41260: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.41264: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.41267: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.41446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.41852: done with get_vars() 34350 1726853751.41862: done getting variables 34350 1726853751.42224: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:51 -0400 (0:00:00.047) 0:00:05.295 ****** 34350 1726853751.42257: entering _queue_task() for managed_node1/service 34350 1726853751.42741: worker is 1 (out of 1 available) 34350 1726853751.42755: exiting _queue_task() for managed_node1/service 34350 1726853751.42767: done queuing things up, now waiting for results queue to drain 34350 1726853751.42768: waiting for pending results... 
34350 1726853751.43668: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34350 1726853751.43676: in run() - task 02083763-bbaf-b6c1-0de4-0000000000bd 34350 1726853751.43680: variable 'ansible_search_path' from source: unknown 34350 1726853751.43682: variable 'ansible_search_path' from source: unknown 34350 1726853751.43801: calling self._execute() 34350 1726853751.43964: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.44176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.44180: variable 'omit' from source: magic vars 34350 1726853751.44739: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.44875: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.45113: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.45295: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.45298: when evaluation is False, skipping this task 34350 1726853751.45301: _execute() done 34350 1726853751.45303: dumping result to json 34350 1726853751.45305: done dumping result, returning 34350 1726853751.45307: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-b6c1-0de4-0000000000bd] 34350 1726853751.45309: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bd 34350 1726853751.45385: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bd 34350 1726853751.45388: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.45437: no more pending results, returning what we have 34350 1726853751.45442: results queue empty 34350 1726853751.45443: checking for any_errors_fatal 
34350 1726853751.45449: done checking for any_errors_fatal 34350 1726853751.45450: checking for max_fail_percentage 34350 1726853751.45452: done checking for max_fail_percentage 34350 1726853751.45452: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.45453: done checking to see if all hosts have failed 34350 1726853751.45454: getting the remaining hosts for this loop 34350 1726853751.45455: done getting the remaining hosts for this loop 34350 1726853751.45461: getting the next task for host managed_node1 34350 1726853751.45469: done getting next task for host managed_node1 34350 1726853751.45475: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34350 1726853751.45477: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.45501: getting variables 34350 1726853751.45503: in VariableManager get_vars() 34350 1726853751.45553: Calling all_inventory to load vars for managed_node1 34350 1726853751.45557: Calling groups_inventory to load vars for managed_node1 34350 1726853751.45562: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.45781: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.45785: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.45789: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.46418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.46848: done with get_vars() 34350 1726853751.46977: done getting variables 34350 1726853751.47037: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:51 -0400 (0:00:00.048) 0:00:05.343 ****** 34350 1726853751.47174: entering _queue_task() for managed_node1/service 34350 1726853751.47728: worker is 1 (out of 1 available) 34350 1726853751.47856: exiting _queue_task() for managed_node1/service 34350 1726853751.47869: done queuing things up, now waiting for results queue to drain 34350 1726853751.47872: waiting for pending results... 
34350 1726853751.48393: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34350 1726853751.48400: in run() - task 02083763-bbaf-b6c1-0de4-0000000000be 34350 1726853751.48403: variable 'ansible_search_path' from source: unknown 34350 1726853751.48405: variable 'ansible_search_path' from source: unknown 34350 1726853751.48613: calling self._execute() 34350 1726853751.48694: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.48697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.48711: variable 'omit' from source: magic vars 34350 1726853751.49292: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.49367: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.49610: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.49615: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.49618: when evaluation is False, skipping this task 34350 1726853751.49622: _execute() done 34350 1726853751.49624: dumping result to json 34350 1726853751.49628: done dumping result, returning 34350 1726853751.49636: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-b6c1-0de4-0000000000be] 34350 1726853751.49641: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000be 34350 1726853751.49914: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000be 34350 1726853751.49918: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34350 1726853751.49970: no more pending results, returning what we have 34350 1726853751.49977: results queue empty 34350 1726853751.49978: checking for any_errors_fatal 34350 
1726853751.49988: done checking for any_errors_fatal 34350 1726853751.49989: checking for max_fail_percentage 34350 1726853751.49991: done checking for max_fail_percentage 34350 1726853751.49992: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.49992: done checking to see if all hosts have failed 34350 1726853751.49993: getting the remaining hosts for this loop 34350 1726853751.49995: done getting the remaining hosts for this loop 34350 1726853751.49999: getting the next task for host managed_node1 34350 1726853751.50008: done getting next task for host managed_node1 34350 1726853751.50012: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34350 1726853751.50020: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.50044: getting variables 34350 1726853751.50046: in VariableManager get_vars() 34350 1726853751.50106: Calling all_inventory to load vars for managed_node1 34350 1726853751.50109: Calling groups_inventory to load vars for managed_node1 34350 1726853751.50111: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.50122: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.50246: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.50251: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.50437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.50704: done with get_vars() 34350 1726853751.50715: done getting variables 34350 1726853751.50777: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:51 -0400 (0:00:00.037) 0:00:05.380 ****** 34350 1726853751.50819: entering _queue_task() for managed_node1/copy 34350 1726853751.51168: worker is 1 (out of 1 available) 34350 1726853751.51183: exiting _queue_task() for managed_node1/copy 34350 1726853751.51195: done queuing things up, now waiting for results queue to drain 34350 1726853751.51196: waiting for pending results... 
34350 1726853751.51705: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34350 1726853751.51779: in run() - task 02083763-bbaf-b6c1-0de4-0000000000bf 34350 1726853751.51806: variable 'ansible_search_path' from source: unknown 34350 1726853751.51814: variable 'ansible_search_path' from source: unknown 34350 1726853751.51824: calling self._execute() 34350 1726853751.51980: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.51990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.51993: variable 'omit' from source: magic vars 34350 1726853751.52304: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.52322: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.52430: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.52442: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.52450: when evaluation is False, skipping this task 34350 1726853751.52457: _execute() done 34350 1726853751.52465: dumping result to json 34350 1726853751.52476: done dumping result, returning 34350 1726853751.52489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-b6c1-0de4-0000000000bf] 34350 1726853751.52500: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bf 34350 1726853751.52648: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000bf 34350 1726853751.52651: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.52726: no more pending results, returning what we have 34350 1726853751.52730: results queue empty 34350 
1726853751.52730: checking for any_errors_fatal 34350 1726853751.52736: done checking for any_errors_fatal 34350 1726853751.52737: checking for max_fail_percentage 34350 1726853751.52738: done checking for max_fail_percentage 34350 1726853751.52739: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.52740: done checking to see if all hosts have failed 34350 1726853751.52741: getting the remaining hosts for this loop 34350 1726853751.52742: done getting the remaining hosts for this loop 34350 1726853751.52745: getting the next task for host managed_node1 34350 1726853751.52867: done getting next task for host managed_node1 34350 1726853751.52872: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853751.52875: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.52890: getting variables 34350 1726853751.52892: in VariableManager get_vars() 34350 1726853751.52934: Calling all_inventory to load vars for managed_node1 34350 1726853751.52937: Calling groups_inventory to load vars for managed_node1 34350 1726853751.52939: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.52948: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.52951: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.52954: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.53340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.53735: done with get_vars() 34350 1726853751.53744: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:51 -0400 (0:00:00.030) 0:00:05.411 ****** 34350 1726853751.53839: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853751.54126: worker is 1 (out of 1 available) 34350 1726853751.54138: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853751.54149: done queuing things up, now waiting for results queue to drain 34350 1726853751.54150: waiting for pending results... 
34350 1726853751.54486: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853751.54653: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c0 34350 1726853751.54877: variable 'ansible_search_path' from source: unknown 34350 1726853751.54881: variable 'ansible_search_path' from source: unknown 34350 1726853751.54883: calling self._execute() 34350 1726853751.55028: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.55034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.55045: variable 'omit' from source: magic vars 34350 1726853751.55925: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.55938: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.56081: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.56103: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.56107: when evaluation is False, skipping this task 34350 1726853751.56110: _execute() done 34350 1726853751.56112: dumping result to json 34350 1726853751.56117: done dumping result, returning 34350 1726853751.56127: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-b6c1-0de4-0000000000c0] 34350 1726853751.56132: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c0 34350 1726853751.56238: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c0 34350 1726853751.56242: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.56297: no more pending results, returning what we have 34350 1726853751.56385: results queue empty 34350 1726853751.56386: checking 
for any_errors_fatal 34350 1726853751.56394: done checking for any_errors_fatal 34350 1726853751.56395: checking for max_fail_percentage 34350 1726853751.56397: done checking for max_fail_percentage 34350 1726853751.56398: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.56399: done checking to see if all hosts have failed 34350 1726853751.56399: getting the remaining hosts for this loop 34350 1726853751.56402: done getting the remaining hosts for this loop 34350 1726853751.56405: getting the next task for host managed_node1 34350 1726853751.56482: done getting next task for host managed_node1 34350 1726853751.56486: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853751.56489: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.56506: getting variables 34350 1726853751.56508: in VariableManager get_vars() 34350 1726853751.56675: Calling all_inventory to load vars for managed_node1 34350 1726853751.56679: Calling groups_inventory to load vars for managed_node1 34350 1726853751.56681: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.56690: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.56699: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.56703: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.56949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.57183: done with get_vars() 34350 1726853751.57200: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:51 -0400 (0:00:00.034) 0:00:05.445 ****** 34350 1726853751.57323: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853751.57627: worker is 1 (out of 1 available) 34350 1726853751.57639: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853751.57660: done queuing things up, now waiting for results queue to drain 34350 1726853751.57662: waiting for pending results... 
34350 1726853751.58189: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853751.58195: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c1 34350 1726853751.58199: variable 'ansible_search_path' from source: unknown 34350 1726853751.58201: variable 'ansible_search_path' from source: unknown 34350 1726853751.58204: calling self._execute() 34350 1726853751.58257: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.58266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.58279: variable 'omit' from source: magic vars 34350 1726853751.58716: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.58727: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.58869: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.58876: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.58879: when evaluation is False, skipping this task 34350 1726853751.58882: _execute() done 34350 1726853751.58884: dumping result to json 34350 1726853751.58889: done dumping result, returning 34350 1726853751.58903: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-b6c1-0de4-0000000000c1] 34350 1726853751.58906: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c1 34350 1726853751.59001: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c1 34350 1726853751.59003: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.59101: no more pending results, returning what we have 34350 1726853751.59105: results queue empty 34350 1726853751.59106: checking for any_errors_fatal 34350 
1726853751.59115: done checking for any_errors_fatal 34350 1726853751.59116: checking for max_fail_percentage 34350 1726853751.59117: done checking for max_fail_percentage 34350 1726853751.59118: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.59119: done checking to see if all hosts have failed 34350 1726853751.59120: getting the remaining hosts for this loop 34350 1726853751.59121: done getting the remaining hosts for this loop 34350 1726853751.59125: getting the next task for host managed_node1 34350 1726853751.59133: done getting next task for host managed_node1 34350 1726853751.59137: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853751.59141: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.59165: getting variables 34350 1726853751.59167: in VariableManager get_vars() 34350 1726853751.59218: Calling all_inventory to load vars for managed_node1 34350 1726853751.59221: Calling groups_inventory to load vars for managed_node1 34350 1726853751.59224: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.59236: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.59239: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.59242: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.59665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.59891: done with get_vars() 34350 1726853751.59901: done getting variables 34350 1726853751.59957: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:51 -0400 (0:00:00.026) 0:00:05.472 ****** 34350 1726853751.59994: entering _queue_task() for managed_node1/debug 34350 1726853751.60311: worker is 1 (out of 1 available) 34350 1726853751.60324: exiting _queue_task() for managed_node1/debug 34350 1726853751.60334: done queuing things up, now waiting for results queue to drain 34350 1726853751.60335: waiting for pending results... 
34350 1726853751.60681: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853751.60787: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c2 34350 1726853751.60819: variable 'ansible_search_path' from source: unknown 34350 1726853751.60822: variable 'ansible_search_path' from source: unknown 34350 1726853751.60976: calling self._execute() 34350 1726853751.60985: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.60992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.61001: variable 'omit' from source: magic vars 34350 1726853751.61427: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.61438: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.61565: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.61572: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.61576: when evaluation is False, skipping this task 34350 1726853751.61579: _execute() done 34350 1726853751.61581: dumping result to json 34350 1726853751.61585: done dumping result, returning 34350 1726853751.61594: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-b6c1-0de4-0000000000c2] 34350 1726853751.61599: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c2 34350 1726853751.61820: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c2 34350 1726853751.61823: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853751.61866: no more pending results, returning what we have 34350 1726853751.61870: results queue empty 34350 1726853751.61873: checking for any_errors_fatal 34350 1726853751.61877: done 
checking for any_errors_fatal 34350 1726853751.61878: checking for max_fail_percentage 34350 1726853751.61879: done checking for max_fail_percentage 34350 1726853751.61880: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.61881: done checking to see if all hosts have failed 34350 1726853751.61882: getting the remaining hosts for this loop 34350 1726853751.61883: done getting the remaining hosts for this loop 34350 1726853751.61886: getting the next task for host managed_node1 34350 1726853751.61892: done getting next task for host managed_node1 34350 1726853751.61896: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853751.61899: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.61916: getting variables 34350 1726853751.61917: in VariableManager get_vars() 34350 1726853751.62283: Calling all_inventory to load vars for managed_node1 34350 1726853751.62287: Calling groups_inventory to load vars for managed_node1 34350 1726853751.62290: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.62300: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.62303: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.62306: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.62608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.62940: done with get_vars() 34350 1726853751.62951: done getting variables 34350 1726853751.63005: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:51 -0400 (0:00:00.030) 0:00:05.503 ****** 34350 1726853751.63049: entering _queue_task() for managed_node1/debug 34350 1726853751.63334: worker is 1 (out of 1 available) 34350 1726853751.63345: exiting _queue_task() for managed_node1/debug 34350 1726853751.63361: done queuing things up, now waiting for results queue to drain 34350 1726853751.63363: waiting for pending results... 
34350 1726853751.63612: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853751.63778: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c3 34350 1726853751.63789: variable 'ansible_search_path' from source: unknown 34350 1726853751.63792: variable 'ansible_search_path' from source: unknown 34350 1726853751.63794: calling self._execute() 34350 1726853751.63895: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.63899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.63902: variable 'omit' from source: magic vars 34350 1726853751.64197: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.64207: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.64316: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.64325: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.64328: when evaluation is False, skipping this task 34350 1726853751.64331: _execute() done 34350 1726853751.64334: dumping result to json 34350 1726853751.64375: done dumping result, returning 34350 1726853751.64378: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-b6c1-0de4-0000000000c3] 34350 1726853751.64382: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c3 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853751.64577: no more pending results, returning what we have 34350 1726853751.64581: results queue empty 34350 1726853751.64582: checking for any_errors_fatal 34350 1726853751.64586: done checking for any_errors_fatal 34350 1726853751.64587: checking for max_fail_percentage 34350 1726853751.64588: done checking for 
max_fail_percentage 34350 1726853751.64589: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.64589: done checking to see if all hosts have failed 34350 1726853751.64590: getting the remaining hosts for this loop 34350 1726853751.64591: done getting the remaining hosts for this loop 34350 1726853751.64594: getting the next task for host managed_node1 34350 1726853751.64599: done getting next task for host managed_node1 34350 1726853751.64602: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853751.64604: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.64619: getting variables 34350 1726853751.64621: in VariableManager get_vars() 34350 1726853751.64679: Calling all_inventory to load vars for managed_node1 34350 1726853751.64681: Calling groups_inventory to load vars for managed_node1 34350 1726853751.64682: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.64690: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.64692: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.64695: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.64962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.65177: done with get_vars() 34350 1726853751.65188: done getting variables 34350 1726853751.65222: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c3 34350 1726853751.65225: WORKER PROCESS EXITING 34350 1726853751.65256: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:51 -0400 (0:00:00.022) 0:00:05.525 ****** 34350 1726853751.65291: entering _queue_task() for managed_node1/debug 34350 1726853751.65602: worker is 1 (out of 1 available) 34350 1726853751.65619: exiting _queue_task() for managed_node1/debug 34350 1726853751.65633: done queuing things up, now waiting for results queue to drain 34350 1726853751.65635: waiting for pending results... 
34350 1726853751.65861: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853751.65944: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c4 34350 1726853751.65956: variable 'ansible_search_path' from source: unknown 34350 1726853751.65960: variable 'ansible_search_path' from source: unknown 34350 1726853751.65997: calling self._execute() 34350 1726853751.66057: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.66060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.66074: variable 'omit' from source: magic vars 34350 1726853751.66347: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.66356: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.66438: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.66442: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.66445: when evaluation is False, skipping this task 34350 1726853751.66450: _execute() done 34350 1726853751.66453: dumping result to json 34350 1726853751.66455: done dumping result, returning 34350 1726853751.66466: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-b6c1-0de4-0000000000c4] 34350 1726853751.66469: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c4 34350 1726853751.66551: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c4 34350 1726853751.66553: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853751.66601: no more pending results, returning what we have 34350 1726853751.66604: results queue empty 34350 1726853751.66605: checking for any_errors_fatal 34350 1726853751.66612: done checking for 
any_errors_fatal 34350 1726853751.66612: checking for max_fail_percentage 34350 1726853751.66614: done checking for max_fail_percentage 34350 1726853751.66615: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.66615: done checking to see if all hosts have failed 34350 1726853751.66616: getting the remaining hosts for this loop 34350 1726853751.66618: done getting the remaining hosts for this loop 34350 1726853751.66621: getting the next task for host managed_node1 34350 1726853751.66628: done getting next task for host managed_node1 34350 1726853751.66631: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34350 1726853751.66634: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.66652: getting variables 34350 1726853751.66653: in VariableManager get_vars() 34350 1726853751.66694: Calling all_inventory to load vars for managed_node1 34350 1726853751.66696: Calling groups_inventory to load vars for managed_node1 34350 1726853751.66698: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.66707: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.66709: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.66712: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.66829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.66948: done with get_vars() 34350 1726853751.66956: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:51 -0400 (0:00:00.017) 0:00:05.542 ****** 34350 1726853751.67021: entering _queue_task() for managed_node1/ping 34350 1726853751.67218: worker is 1 (out of 1 available) 34350 1726853751.67233: exiting _queue_task() for managed_node1/ping 34350 1726853751.67243: done queuing things up, now waiting for results queue to drain 34350 1726853751.67245: waiting for pending results... 
34350 1726853751.67404: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34350 1726853751.67481: in run() - task 02083763-bbaf-b6c1-0de4-0000000000c5 34350 1726853751.67493: variable 'ansible_search_path' from source: unknown 34350 1726853751.67496: variable 'ansible_search_path' from source: unknown 34350 1726853751.67523: calling self._execute() 34350 1726853751.67592: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.67595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.67605: variable 'omit' from source: magic vars 34350 1726853751.68176: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.68180: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.68183: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.68185: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.68188: when evaluation is False, skipping this task 34350 1726853751.68190: _execute() done 34350 1726853751.68192: dumping result to json 34350 1726853751.68194: done dumping result, returning 34350 1726853751.68201: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-b6c1-0de4-0000000000c5] 34350 1726853751.68203: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c5 34350 1726853751.68262: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000c5 34350 1726853751.68264: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.68345: no more pending results, returning what we have 34350 1726853751.68348: results queue empty 34350 1726853751.68349: checking for any_errors_fatal 34350 
1726853751.68355: done checking for any_errors_fatal 34350 1726853751.68355: checking for max_fail_percentage 34350 1726853751.68357: done checking for max_fail_percentage 34350 1726853751.68357: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.68361: done checking to see if all hosts have failed 34350 1726853751.68361: getting the remaining hosts for this loop 34350 1726853751.68363: done getting the remaining hosts for this loop 34350 1726853751.68367: getting the next task for host managed_node1 34350 1726853751.68377: done getting next task for host managed_node1 34350 1726853751.68379: ^ task is: TASK: meta (role_complete) 34350 1726853751.68381: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853751.68398: getting variables 34350 1726853751.68400: in VariableManager get_vars() 34350 1726853751.68438: Calling all_inventory to load vars for managed_node1 34350 1726853751.68440: Calling groups_inventory to load vars for managed_node1 34350 1726853751.68442: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.68450: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.68452: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.68455: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.68716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.68934: done with get_vars() 34350 1726853751.68944: done getting variables 34350 1726853751.69028: done queuing things up, now waiting for results queue to drain 34350 1726853751.69030: results queue empty 34350 1726853751.69030: checking for any_errors_fatal 34350 1726853751.69032: done checking for any_errors_fatal 34350 1726853751.69033: checking for max_fail_percentage 34350 1726853751.69034: done checking for max_fail_percentage 34350 1726853751.69035: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.69036: done checking to see if all hosts have failed 34350 1726853751.69036: getting the remaining hosts for this loop 34350 1726853751.69037: done getting the remaining hosts for this loop 34350 1726853751.69040: getting the next task for host managed_node1 34350 1726853751.69046: done getting next task for host managed_node1 34350 1726853751.69048: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34350 1726853751.69051: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34350 1726853751.69062: getting variables 34350 1726853751.69063: in VariableManager get_vars() 34350 1726853751.69084: Calling all_inventory to load vars for managed_node1 34350 1726853751.69086: Calling groups_inventory to load vars for managed_node1 34350 1726853751.69088: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.69092: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.69095: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.69097: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.69248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.69512: done with get_vars() 34350 1726853751.69521: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:51 -0400 (0:00:00.025) 0:00:05.568 ****** 34350 1726853751.69577: entering _queue_task() for managed_node1/include_tasks 34350 1726853751.70031: worker is 1 (out of 1 available) 34350 1726853751.70040: exiting _queue_task() for managed_node1/include_tasks 34350 1726853751.70050: done queuing things up, now waiting for 
results queue to drain 34350 1726853751.70051: waiting for pending results... 34350 1726853751.70099: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 34350 1726853751.70233: in run() - task 02083763-bbaf-b6c1-0de4-0000000000fd 34350 1726853751.70250: variable 'ansible_search_path' from source: unknown 34350 1726853751.70254: variable 'ansible_search_path' from source: unknown 34350 1726853751.70295: calling self._execute() 34350 1726853751.70384: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.70388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.70400: variable 'omit' from source: magic vars 34350 1726853751.70808: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.70827: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.70945: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.70951: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.70954: when evaluation is False, skipping this task 34350 1726853751.70956: _execute() done 34350 1726853751.70961: dumping result to json 34350 1726853751.70964: done dumping result, returning 34350 1726853751.70975: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-b6c1-0de4-0000000000fd] 34350 1726853751.70978: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000fd 34350 1726853751.71067: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000fd 34350 1726853751.71069: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.71122: no more pending results, returning what we have 34350 
1726853751.71126: results queue empty 34350 1726853751.71127: checking for any_errors_fatal 34350 1726853751.71128: done checking for any_errors_fatal 34350 1726853751.71129: checking for max_fail_percentage 34350 1726853751.71130: done checking for max_fail_percentage 34350 1726853751.71131: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.71132: done checking to see if all hosts have failed 34350 1726853751.71132: getting the remaining hosts for this loop 34350 1726853751.71134: done getting the remaining hosts for this loop 34350 1726853751.71137: getting the next task for host managed_node1 34350 1726853751.71145: done getting next task for host managed_node1 34350 1726853751.71149: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 34350 1726853751.71153: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.71178: getting variables 34350 1726853751.71180: in VariableManager get_vars() 34350 1726853751.71223: Calling all_inventory to load vars for managed_node1 34350 1726853751.71226: Calling groups_inventory to load vars for managed_node1 34350 1726853751.71228: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.71236: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.71238: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.71241: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.71661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.71854: done with get_vars() 34350 1726853751.71868: done getting variables 34350 1726853751.71924: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:51 -0400 (0:00:00.023) 0:00:05.592 ****** 34350 1726853751.71959: entering _queue_task() for managed_node1/debug 34350 1726853751.72234: worker is 1 (out of 1 available) 34350 1726853751.72247: exiting _queue_task() for managed_node1/debug 34350 1726853751.72257: done queuing things up, now waiting for results queue to drain 34350 1726853751.72259: waiting for pending results... 
34350 1726853751.72541: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 34350 1726853751.72678: in run() - task 02083763-bbaf-b6c1-0de4-0000000000fe 34350 1726853751.72693: variable 'ansible_search_path' from source: unknown 34350 1726853751.72697: variable 'ansible_search_path' from source: unknown 34350 1726853751.72743: calling self._execute() 34350 1726853751.72878: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.72881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.72885: variable 'omit' from source: magic vars 34350 1726853751.73698: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.73712: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.73866: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.73870: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.73876: when evaluation is False, skipping this task 34350 1726853751.73879: _execute() done 34350 1726853751.73881: dumping result to json 34350 1726853751.73884: done dumping result, returning 34350 1726853751.73951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-b6c1-0de4-0000000000fe] 34350 1726853751.73955: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000fe 34350 1726853751.74024: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000fe 34350 1726853751.74028: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853751.74076: no more pending results, returning what we have 34350 1726853751.74080: results queue empty 34350 1726853751.74081: checking for any_errors_fatal 34350 1726853751.74089: done checking for any_errors_fatal 34350 1726853751.74090: 
checking for max_fail_percentage 34350 1726853751.74091: done checking for max_fail_percentage 34350 1726853751.74092: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.74093: done checking to see if all hosts have failed 34350 1726853751.74094: getting the remaining hosts for this loop 34350 1726853751.74095: done getting the remaining hosts for this loop 34350 1726853751.74099: getting the next task for host managed_node1 34350 1726853751.74106: done getting next task for host managed_node1 34350 1726853751.74111: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34350 1726853751.74115: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.74134: getting variables 34350 1726853751.74136: in VariableManager get_vars() 34350 1726853751.74184: Calling all_inventory to load vars for managed_node1 34350 1726853751.74187: Calling groups_inventory to load vars for managed_node1 34350 1726853751.74189: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.74199: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.74202: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.74205: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.74684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.75113: done with get_vars() 34350 1726853751.75126: done getting variables 34350 1726853751.75390: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:51 -0400 (0:00:00.034) 0:00:05.626 ****** 34350 1726853751.75424: entering _queue_task() for managed_node1/fail 34350 1726853751.75749: worker is 1 (out of 1 available) 34350 1726853751.75762: exiting _queue_task() for managed_node1/fail 34350 1726853751.75975: done queuing things up, now waiting for results queue to drain 34350 1726853751.75976: waiting for pending results... 
34350 1726853751.76203: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 34350 1726853751.76504: in run() - task 02083763-bbaf-b6c1-0de4-0000000000ff 34350 1726853751.76508: variable 'ansible_search_path' from source: unknown 34350 1726853751.76511: variable 'ansible_search_path' from source: unknown 34350 1726853751.76513: calling self._execute() 34350 1726853751.76678: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.76683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.76687: variable 'omit' from source: magic vars 34350 1726853751.77405: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.77416: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.77828: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.77832: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.77836: when evaluation is False, skipping this task 34350 1726853751.77839: _execute() done 34350 1726853751.77841: dumping result to json 34350 1726853751.77843: done dumping result, returning 34350 1726853751.77846: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-b6c1-0de4-0000000000ff] 34350 1726853751.77849: sending task result for task 02083763-bbaf-b6c1-0de4-0000000000ff 34350 1726853751.78232: done sending task result for task 02083763-bbaf-b6c1-0de4-0000000000ff 34350 1726853751.78237: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 
34350 1726853751.78291: no more pending results, returning what we have 34350 1726853751.78296: results queue empty 34350 1726853751.78296: checking for any_errors_fatal 34350 1726853751.78303: done checking for any_errors_fatal 34350 1726853751.78304: checking for max_fail_percentage 34350 1726853751.78305: done checking for max_fail_percentage 34350 1726853751.78306: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.78306: done checking to see if all hosts have failed 34350 1726853751.78307: getting the remaining hosts for this loop 34350 1726853751.78309: done getting the remaining hosts for this loop 34350 1726853751.78312: getting the next task for host managed_node1 34350 1726853751.78320: done getting next task for host managed_node1 34350 1726853751.78324: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34350 1726853751.78329: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.78351: getting variables 34350 1726853751.78353: in VariableManager get_vars() 34350 1726853751.78407: Calling all_inventory to load vars for managed_node1 34350 1726853751.78410: Calling groups_inventory to load vars for managed_node1 34350 1726853751.78413: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.78425: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.78428: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.78431: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.78960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.79558: done with get_vars() 34350 1726853751.79689: done getting variables 34350 1726853751.79758: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:51 -0400 (0:00:00.044) 0:00:05.671 ****** 34350 1726853751.79904: entering _queue_task() for managed_node1/fail 34350 1726853751.80640: worker is 1 (out of 1 available) 34350 1726853751.80655: exiting _queue_task() for managed_node1/fail 34350 1726853751.80666: done queuing things up, now waiting for results queue to drain 34350 1726853751.80668: waiting for pending results... 
34350 1726853751.81390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 34350 1726853751.81482: in run() - task 02083763-bbaf-b6c1-0de4-000000000100 34350 1726853751.81745: variable 'ansible_search_path' from source: unknown 34350 1726853751.81749: variable 'ansible_search_path' from source: unknown 34350 1726853751.81752: calling self._execute() 34350 1726853751.82008: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.82013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.82292: variable 'omit' from source: magic vars 34350 1726853751.83069: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.83093: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.83215: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.83221: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.83224: when evaluation is False, skipping this task 34350 1726853751.83229: _execute() done 34350 1726853751.83237: dumping result to json 34350 1726853751.83239: done dumping result, returning 34350 1726853751.83248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-b6c1-0de4-000000000100] 34350 1726853751.83251: sending task result for task 02083763-bbaf-b6c1-0de4-000000000100 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.83397: no more pending results, returning what we have 34350 1726853751.83401: results queue empty 34350 1726853751.83402: checking for any_errors_fatal 34350 
1726853751.83407: done checking for any_errors_fatal 34350 1726853751.83408: checking for max_fail_percentage 34350 1726853751.83410: done checking for max_fail_percentage 34350 1726853751.83411: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.83412: done checking to see if all hosts have failed 34350 1726853751.83412: getting the remaining hosts for this loop 34350 1726853751.83414: done getting the remaining hosts for this loop 34350 1726853751.83419: getting the next task for host managed_node1 34350 1726853751.83428: done getting next task for host managed_node1 34350 1726853751.83431: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34350 1726853751.83437: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.83456: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000100 34350 1726853751.83468: getting variables 34350 1726853751.83472: in VariableManager get_vars() 34350 1726853751.83522: Calling all_inventory to load vars for managed_node1 34350 1726853751.83525: Calling groups_inventory to load vars for managed_node1 34350 1726853751.83528: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.83541: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.83544: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.83547: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.83931: WORKER PROCESS EXITING 34350 1726853751.83963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.84184: done with get_vars() 34350 1726853751.84195: done getting variables 34350 1726853751.84250: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:51 -0400 (0:00:00.043) 0:00:05.715 ****** 34350 1726853751.84284: entering _queue_task() for managed_node1/fail 34350 1726853751.84534: worker is 1 (out of 1 available) 34350 1726853751.84546: exiting _queue_task() for managed_node1/fail 34350 1726853751.84558: done queuing things up, now waiting for results queue to drain 34350 1726853751.84559: waiting for pending results... 
34350 1726853751.84831: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 34350 1726853751.84948: in run() - task 02083763-bbaf-b6c1-0de4-000000000101 34350 1726853751.85047: variable 'ansible_search_path' from source: unknown 34350 1726853751.85051: variable 'ansible_search_path' from source: unknown 34350 1726853751.85054: calling self._execute() 34350 1726853751.85084: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.85090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.85104: variable 'omit' from source: magic vars 34350 1726853751.85465: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.85476: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.85593: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.85597: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.85599: when evaluation is False, skipping this task 34350 1726853751.85602: _execute() done 34350 1726853751.85604: dumping result to json 34350 1726853751.85606: done dumping result, returning 34350 1726853751.85613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-b6c1-0de4-000000000101] 34350 1726853751.85616: sending task result for task 02083763-bbaf-b6c1-0de4-000000000101 34350 1726853751.85798: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000101 34350 1726853751.85802: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.85915: no more pending 
results, returning what we have 34350 1726853751.85918: results queue empty 34350 1726853751.85919: checking for any_errors_fatal 34350 1726853751.85924: done checking for any_errors_fatal 34350 1726853751.85924: checking for max_fail_percentage 34350 1726853751.85926: done checking for max_fail_percentage 34350 1726853751.85926: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.85927: done checking to see if all hosts have failed 34350 1726853751.85928: getting the remaining hosts for this loop 34350 1726853751.85929: done getting the remaining hosts for this loop 34350 1726853751.85932: getting the next task for host managed_node1 34350 1726853751.85938: done getting next task for host managed_node1 34350 1726853751.85942: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34350 1726853751.85946: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.85962: getting variables 34350 1726853751.85964: in VariableManager get_vars() 34350 1726853751.86007: Calling all_inventory to load vars for managed_node1 34350 1726853751.86010: Calling groups_inventory to load vars for managed_node1 34350 1726853751.86012: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.86020: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.86023: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.86026: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.86236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.86404: done with get_vars() 34350 1726853751.86412: done getting variables 34350 1726853751.86460: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:51 -0400 (0:00:00.022) 0:00:05.737 ****** 34350 1726853751.86490: entering _queue_task() for managed_node1/dnf 34350 1726853751.86756: worker is 1 (out of 1 available) 34350 1726853751.86770: exiting _queue_task() for managed_node1/dnf 34350 1726853751.86782: done queuing things up, now waiting for results queue to drain 34350 1726853751.86783: waiting for pending results... 
34350 1726853751.87190: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 34350 1726853751.87195: in run() - task 02083763-bbaf-b6c1-0de4-000000000102 34350 1726853751.87214: variable 'ansible_search_path' from source: unknown 34350 1726853751.87223: variable 'ansible_search_path' from source: unknown 34350 1726853751.87266: calling self._execute() 34350 1726853751.87361: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.87376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.87393: variable 'omit' from source: magic vars 34350 1726853751.87777: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.87795: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.87915: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.87926: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.87933: when evaluation is False, skipping this task 34350 1726853751.87945: _execute() done 34350 1726853751.87953: dumping result to json 34350 1726853751.87975: done dumping result, returning 34350 1726853751.87979: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000102] 34350 1726853751.87983: sending task result for task 02083763-bbaf-b6c1-0de4-000000000102 34350 1726853751.88313: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000102 34350 1726853751.88316: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34350 1726853751.88355: no more pending results, returning what we have 34350 1726853751.88358: results queue empty 34350 1726853751.88359: checking for any_errors_fatal 34350 1726853751.88364: done checking for any_errors_fatal 34350 1726853751.88365: checking for max_fail_percentage 34350 1726853751.88367: done checking for max_fail_percentage 34350 1726853751.88367: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.88369: done checking to see if all hosts have failed 34350 1726853751.88369: getting the remaining hosts for this loop 34350 1726853751.88372: done getting the remaining hosts for this loop 34350 1726853751.88375: getting the next task for host managed_node1 34350 1726853751.88382: done getting next task for host managed_node1 34350 1726853751.88385: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34350 1726853751.88390: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.88406: getting variables 34350 1726853751.88408: in VariableManager get_vars() 34350 1726853751.88450: Calling all_inventory to load vars for managed_node1 34350 1726853751.88453: Calling groups_inventory to load vars for managed_node1 34350 1726853751.88456: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.88465: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.88468: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.88473: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.88642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.88864: done with get_vars() 34350 1726853751.88876: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 34350 1726853751.88949: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:51 -0400 (0:00:00.024) 0:00:05.762 ****** 34350 1726853751.88981: entering _queue_task() for managed_node1/yum 34350 1726853751.89238: worker is 1 (out of 1 available) 34350 1726853751.89250: exiting _queue_task() for managed_node1/yum 34350 1726853751.89261: done queuing things up, now waiting for results queue to drain 34350 1726853751.89262: waiting for pending results... 
34350 1726853751.89521: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 34350 1726853751.89646: in run() - task 02083763-bbaf-b6c1-0de4-000000000103 34350 1726853751.89664: variable 'ansible_search_path' from source: unknown 34350 1726853751.89674: variable 'ansible_search_path' from source: unknown 34350 1726853751.89716: calling self._execute() 34350 1726853751.89804: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.89814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.89827: variable 'omit' from source: magic vars 34350 1726853751.90208: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.90224: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.90342: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.90458: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.90461: when evaluation is False, skipping this task 34350 1726853751.90464: _execute() done 34350 1726853751.90466: dumping result to json 34350 1726853751.90468: done dumping result, returning 34350 1726853751.90473: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000103] 34350 1726853751.90476: sending task result for task 02083763-bbaf-b6c1-0de4-000000000103 34350 1726853751.90547: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000103 34350 1726853751.90550: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was 
False" } 34350 1726853751.90601: no more pending results, returning what we have 34350 1726853751.90605: results queue empty 34350 1726853751.90606: checking for any_errors_fatal 34350 1726853751.90611: done checking for any_errors_fatal 34350 1726853751.90612: checking for max_fail_percentage 34350 1726853751.90613: done checking for max_fail_percentage 34350 1726853751.90614: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.90615: done checking to see if all hosts have failed 34350 1726853751.90616: getting the remaining hosts for this loop 34350 1726853751.90617: done getting the remaining hosts for this loop 34350 1726853751.90621: getting the next task for host managed_node1 34350 1726853751.90629: done getting next task for host managed_node1 34350 1726853751.90633: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853751.90638: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.90658: getting variables 34350 1726853751.90660: in VariableManager get_vars() 34350 1726853751.90712: Calling all_inventory to load vars for managed_node1 34350 1726853751.90715: Calling groups_inventory to load vars for managed_node1 34350 1726853751.90717: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.90729: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.90732: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.90734: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.91149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.91356: done with get_vars() 34350 1726853751.91366: done getting variables 34350 1726853751.91423: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:51 -0400 (0:00:00.024) 0:00:05.787 ****** 34350 1726853751.91456: entering _queue_task() for managed_node1/fail 34350 1726853751.91702: worker is 1 (out of 1 available) 34350 1726853751.91715: exiting _queue_task() for managed_node1/fail 34350 1726853751.91725: done queuing things up, now waiting for results queue to drain 34350 1726853751.91727: waiting for pending results... 
34350 1726853751.91986: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 34350 1726853751.92176: in run() - task 02083763-bbaf-b6c1-0de4-000000000104 34350 1726853751.92180: variable 'ansible_search_path' from source: unknown 34350 1726853751.92184: variable 'ansible_search_path' from source: unknown 34350 1726853751.92187: calling self._execute() 34350 1726853751.92301: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.92304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.92307: variable 'omit' from source: magic vars 34350 1726853751.92663: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.92683: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.92801: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.92812: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.92844: when evaluation is False, skipping this task 34350 1726853751.92847: _execute() done 34350 1726853751.92850: dumping result to json 34350 1726853751.92852: done dumping result, returning 34350 1726853751.92855: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000104] 34350 1726853751.92858: sending task result for task 02083763-bbaf-b6c1-0de4-000000000104 34350 1726853751.93023: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000104 34350 1726853751.93027: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.93105: no more pending results, returning what we have 
34350 1726853751.93109: results queue empty 34350 1726853751.93110: checking for any_errors_fatal 34350 1726853751.93115: done checking for any_errors_fatal 34350 1726853751.93116: checking for max_fail_percentage 34350 1726853751.93117: done checking for max_fail_percentage 34350 1726853751.93118: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.93119: done checking to see if all hosts have failed 34350 1726853751.93119: getting the remaining hosts for this loop 34350 1726853751.93121: done getting the remaining hosts for this loop 34350 1726853751.93124: getting the next task for host managed_node1 34350 1726853751.93132: done getting next task for host managed_node1 34350 1726853751.93135: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 34350 1726853751.93140: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.93158: getting variables 34350 1726853751.93160: in VariableManager get_vars() 34350 1726853751.93209: Calling all_inventory to load vars for managed_node1 34350 1726853751.93211: Calling groups_inventory to load vars for managed_node1 34350 1726853751.93214: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.93225: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.93227: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.93229: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.93608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.93814: done with get_vars() 34350 1726853751.93825: done getting variables 34350 1726853751.93887: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:51 -0400 (0:00:00.024) 0:00:05.811 ****** 34350 1726853751.93922: entering _queue_task() for managed_node1/package 34350 1726853751.94197: worker is 1 (out of 1 available) 34350 1726853751.94211: exiting _queue_task() for managed_node1/package 34350 1726853751.94223: done queuing things up, now waiting for results queue to drain 34350 1726853751.94224: waiting for pending results... 
34350 1726853751.94597: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 34350 1726853751.94632: in run() - task 02083763-bbaf-b6c1-0de4-000000000105 34350 1726853751.94655: variable 'ansible_search_path' from source: unknown 34350 1726853751.94664: variable 'ansible_search_path' from source: unknown 34350 1726853751.94713: calling self._execute() 34350 1726853751.94809: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.94820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.94837: variable 'omit' from source: magic vars 34350 1726853751.95210: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.95226: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.95345: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.95360: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.95460: when evaluation is False, skipping this task 34350 1726853751.95463: _execute() done 34350 1726853751.95466: dumping result to json 34350 1726853751.95469: done dumping result, returning 34350 1726853751.95473: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-b6c1-0de4-000000000105] 34350 1726853751.95476: sending task result for task 02083763-bbaf-b6c1-0de4-000000000105 34350 1726853751.95547: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000105 34350 1726853751.95551: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.95602: no more pending results, returning what we have 34350 1726853751.95607: results queue empty 34350 1726853751.95608: checking for any_errors_fatal 34350 1726853751.95614: done 
checking for any_errors_fatal 34350 1726853751.95615: checking for max_fail_percentage 34350 1726853751.95617: done checking for max_fail_percentage 34350 1726853751.95617: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.95618: done checking to see if all hosts have failed 34350 1726853751.95619: getting the remaining hosts for this loop 34350 1726853751.95621: done getting the remaining hosts for this loop 34350 1726853751.95624: getting the next task for host managed_node1 34350 1726853751.95632: done getting next task for host managed_node1 34350 1726853751.95636: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853751.95641: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.95661: getting variables 34350 1726853751.95663: in VariableManager get_vars() 34350 1726853751.95715: Calling all_inventory to load vars for managed_node1 34350 1726853751.95718: Calling groups_inventory to load vars for managed_node1 34350 1726853751.95721: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.95732: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.95734: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.95737: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.96143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.96319: done with get_vars() 34350 1726853751.96329: done getting variables 34350 1726853751.96384: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:51 -0400 (0:00:00.024) 0:00:05.836 ****** 34350 1726853751.96413: entering _queue_task() for managed_node1/package 34350 1726853751.96633: worker is 1 (out of 1 available) 34350 1726853751.96645: exiting _queue_task() for managed_node1/package 34350 1726853751.96655: done queuing things up, now waiting for results queue to drain 34350 1726853751.96656: waiting for pending results... 
34350 1726853751.96995: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 34350 1726853751.97040: in run() - task 02083763-bbaf-b6c1-0de4-000000000106 34350 1726853751.97059: variable 'ansible_search_path' from source: unknown 34350 1726853751.97094: variable 'ansible_search_path' from source: unknown 34350 1726853751.97113: calling self._execute() 34350 1726853751.97201: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.97215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.97276: variable 'omit' from source: magic vars 34350 1726853751.97583: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.97598: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.97715: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.97725: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.97732: when evaluation is False, skipping this task 34350 1726853751.97739: _execute() done 34350 1726853751.97747: dumping result to json 34350 1726853751.97861: done dumping result, returning 34350 1726853751.97865: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-b6c1-0de4-000000000106] 34350 1726853751.97868: sending task result for task 02083763-bbaf-b6c1-0de4-000000000106 34350 1726853751.97936: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000106 34350 1726853751.97939: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853751.97984: no more pending results, returning what we have 34350 1726853751.97988: 
results queue empty 34350 1726853751.97988: checking for any_errors_fatal 34350 1726853751.97995: done checking for any_errors_fatal 34350 1726853751.97995: checking for max_fail_percentage 34350 1726853751.97997: done checking for max_fail_percentage 34350 1726853751.97998: checking to see if all hosts have failed and the running result is not ok 34350 1726853751.97998: done checking to see if all hosts have failed 34350 1726853751.97999: getting the remaining hosts for this loop 34350 1726853751.98001: done getting the remaining hosts for this loop 34350 1726853751.98004: getting the next task for host managed_node1 34350 1726853751.98012: done getting next task for host managed_node1 34350 1726853751.98015: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853751.98020: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853751.98038: getting variables 34350 1726853751.98040: in VariableManager get_vars() 34350 1726853751.98087: Calling all_inventory to load vars for managed_node1 34350 1726853751.98091: Calling groups_inventory to load vars for managed_node1 34350 1726853751.98093: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853751.98104: Calling all_plugins_play to load vars for managed_node1 34350 1726853751.98107: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853751.98110: Calling groups_plugins_play to load vars for managed_node1 34350 1726853751.98415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853751.98621: done with get_vars() 34350 1726853751.98630: done getting variables 34350 1726853751.98684: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:51 -0400 (0:00:00.022) 0:00:05.859 ****** 34350 1726853751.98715: entering _queue_task() for managed_node1/package 34350 1726853751.98928: worker is 1 (out of 1 available) 34350 1726853751.98940: exiting _queue_task() for managed_node1/package 34350 1726853751.98951: done queuing things up, now waiting for results queue to drain 34350 1726853751.98952: waiting for pending results... 
34350 1726853751.99385: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 34350 1726853751.99390: in run() - task 02083763-bbaf-b6c1-0de4-000000000107 34350 1726853751.99393: variable 'ansible_search_path' from source: unknown 34350 1726853751.99395: variable 'ansible_search_path' from source: unknown 34350 1726853751.99397: calling self._execute() 34350 1726853751.99470: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853751.99483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853751.99497: variable 'omit' from source: magic vars 34350 1726853751.99841: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.99856: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853751.99956: variable 'ansible_distribution_major_version' from source: facts 34350 1726853751.99966: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853751.99975: when evaluation is False, skipping this task 34350 1726853751.99981: _execute() done 34350 1726853751.99987: dumping result to json 34350 1726853751.99993: done dumping result, returning 34350 1726853752.00002: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-b6c1-0de4-000000000107] 34350 1726853752.00011: sending task result for task 02083763-bbaf-b6c1-0de4-000000000107 34350 1726853752.00277: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000107 34350 1726853752.00281: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.00319: no more pending results, returning what we have 34350 1726853752.00322: results queue 
empty 34350 1726853752.00323: checking for any_errors_fatal 34350 1726853752.00329: done checking for any_errors_fatal 34350 1726853752.00330: checking for max_fail_percentage 34350 1726853752.00331: done checking for max_fail_percentage 34350 1726853752.00332: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.00333: done checking to see if all hosts have failed 34350 1726853752.00334: getting the remaining hosts for this loop 34350 1726853752.00335: done getting the remaining hosts for this loop 34350 1726853752.00339: getting the next task for host managed_node1 34350 1726853752.00345: done getting next task for host managed_node1 34350 1726853752.00349: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853752.00353: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.00373: getting variables 34350 1726853752.00375: in VariableManager get_vars() 34350 1726853752.00418: Calling all_inventory to load vars for managed_node1 34350 1726853752.00421: Calling groups_inventory to load vars for managed_node1 34350 1726853752.00423: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.00432: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.00435: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.00438: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.00712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.00908: done with get_vars() 34350 1726853752.00918: done getting variables 34350 1726853752.00976: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:52 -0400 (0:00:00.022) 0:00:05.882 ****** 34350 1726853752.01010: entering _queue_task() for managed_node1/service 34350 1726853752.01241: worker is 1 (out of 1 available) 34350 1726853752.01254: exiting _queue_task() for managed_node1/service 34350 1726853752.01265: done queuing things up, now waiting for results queue to drain 34350 1726853752.01266: waiting for pending results... 
34350 1726853752.01526: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 34350 1726853752.01662: in run() - task 02083763-bbaf-b6c1-0de4-000000000108 34350 1726853752.01684: variable 'ansible_search_path' from source: unknown 34350 1726853752.01697: variable 'ansible_search_path' from source: unknown 34350 1726853752.01738: calling self._execute() 34350 1726853752.01837: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.02076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.02080: variable 'omit' from source: magic vars 34350 1726853752.02190: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.02209: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.02317: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.02327: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.02334: when evaluation is False, skipping this task 34350 1726853752.02341: _execute() done 34350 1726853752.02346: dumping result to json 34350 1726853752.02353: done dumping result, returning 34350 1726853752.02364: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-b6c1-0de4-000000000108] 34350 1726853752.02375: sending task result for task 02083763-bbaf-b6c1-0de4-000000000108 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.02617: no more pending results, returning what we have 34350 1726853752.02621: results queue empty 34350 1726853752.02622: checking for any_errors_fatal 34350 1726853752.02629: done checking for any_errors_fatal 34350 1726853752.02630: 
checking for max_fail_percentage 34350 1726853752.02631: done checking for max_fail_percentage 34350 1726853752.02632: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.02633: done checking to see if all hosts have failed 34350 1726853752.02633: getting the remaining hosts for this loop 34350 1726853752.02636: done getting the remaining hosts for this loop 34350 1726853752.02639: getting the next task for host managed_node1 34350 1726853752.02647: done getting next task for host managed_node1 34350 1726853752.02650: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34350 1726853752.02655: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.02676: getting variables 34350 1726853752.02678: in VariableManager get_vars() 34350 1726853752.02722: Calling all_inventory to load vars for managed_node1 34350 1726853752.02725: Calling groups_inventory to load vars for managed_node1 34350 1726853752.02727: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.02739: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.02742: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.02745: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.02991: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000108 34350 1726853752.02994: WORKER PROCESS EXITING 34350 1726853752.03015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.03218: done with get_vars() 34350 1726853752.03228: done getting variables 34350 1726853752.03283: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:52 -0400 (0:00:00.023) 0:00:05.905 ****** 34350 1726853752.03313: entering _queue_task() for managed_node1/service 34350 1726853752.03530: worker is 1 (out of 1 available) 34350 1726853752.03542: exiting _queue_task() for managed_node1/service 34350 1726853752.03554: done queuing things up, now waiting for results queue to drain 34350 1726853752.03555: waiting for pending results... 
34350 1726853752.03806: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 34350 1726853752.03933: in run() - task 02083763-bbaf-b6c1-0de4-000000000109 34350 1726853752.03951: variable 'ansible_search_path' from source: unknown 34350 1726853752.03959: variable 'ansible_search_path' from source: unknown 34350 1726853752.04001: calling self._execute() 34350 1726853752.04085: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.04095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.04111: variable 'omit' from source: magic vars 34350 1726853752.04453: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.04469: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.04776: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.04780: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.04782: when evaluation is False, skipping this task 34350 1726853752.04785: _execute() done 34350 1726853752.04787: dumping result to json 34350 1726853752.04789: done dumping result, returning 34350 1726853752.04792: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-b6c1-0de4-000000000109] 34350 1726853752.04794: sending task result for task 02083763-bbaf-b6c1-0de4-000000000109 34350 1726853752.04852: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000109 34350 1726853752.04855: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34350 1726853752.04887: no more pending results, returning what we have 34350 1726853752.04889: results queue empty 34350 1726853752.04890: checking for any_errors_fatal 
34350 1726853752.04895: done checking for any_errors_fatal 34350 1726853752.04896: checking for max_fail_percentage 34350 1726853752.04897: done checking for max_fail_percentage 34350 1726853752.04898: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.04898: done checking to see if all hosts have failed 34350 1726853752.04899: getting the remaining hosts for this loop 34350 1726853752.04900: done getting the remaining hosts for this loop 34350 1726853752.04903: getting the next task for host managed_node1 34350 1726853752.04909: done getting next task for host managed_node1 34350 1726853752.04912: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34350 1726853752.04916: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.04933: getting variables 34350 1726853752.04934: in VariableManager get_vars() 34350 1726853752.04970: Calling all_inventory to load vars for managed_node1 34350 1726853752.04974: Calling groups_inventory to load vars for managed_node1 34350 1726853752.04976: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.04985: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.04987: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.04990: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.05250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.05440: done with get_vars() 34350 1726853752.05450: done getting variables 34350 1726853752.05505: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:52 -0400 (0:00:00.022) 0:00:05.928 ****** 34350 1726853752.05533: entering _queue_task() for managed_node1/service 34350 1726853752.05749: worker is 1 (out of 1 available) 34350 1726853752.05764: exiting _queue_task() for managed_node1/service 34350 1726853752.05978: done queuing things up, now waiting for results queue to drain 34350 1726853752.05980: waiting for pending results... 
34350 1726853752.06059: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 34350 1726853752.06206: in run() - task 02083763-bbaf-b6c1-0de4-00000000010a 34350 1726853752.06314: variable 'ansible_search_path' from source: unknown 34350 1726853752.06318: variable 'ansible_search_path' from source: unknown 34350 1726853752.06321: calling self._execute() 34350 1726853752.06358: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.06372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.06388: variable 'omit' from source: magic vars 34350 1726853752.06761: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.06780: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.06899: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.06911: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.06919: when evaluation is False, skipping this task 34350 1726853752.06926: _execute() done 34350 1726853752.06933: dumping result to json 34350 1726853752.06941: done dumping result, returning 34350 1726853752.06951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-b6c1-0de4-00000000010a] 34350 1726853752.06960: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.07217: no more pending results, returning what we have 34350 1726853752.07220: results queue empty 34350 1726853752.07221: checking for any_errors_fatal 34350 1726853752.07228: done checking for any_errors_fatal 34350 1726853752.07229: checking for max_fail_percentage 34350 1726853752.07230: 
done checking for max_fail_percentage 34350 1726853752.07231: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.07232: done checking to see if all hosts have failed 34350 1726853752.07232: getting the remaining hosts for this loop 34350 1726853752.07234: done getting the remaining hosts for this loop 34350 1726853752.07238: getting the next task for host managed_node1 34350 1726853752.07245: done getting next task for host managed_node1 34350 1726853752.07249: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 34350 1726853752.07253: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.07275: getting variables 34350 1726853752.07277: in VariableManager get_vars() 34350 1726853752.07322: Calling all_inventory to load vars for managed_node1 34350 1726853752.07325: Calling groups_inventory to load vars for managed_node1 34350 1726853752.07327: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.07337: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.07341: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.07344: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.07589: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010a 34350 1726853752.07593: WORKER PROCESS EXITING 34350 1726853752.07615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.07813: done with get_vars() 34350 1726853752.07823: done getting variables 34350 1726853752.07881: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:52 -0400 (0:00:00.023) 0:00:05.951 ****** 34350 1726853752.07912: entering _queue_task() for managed_node1/service 34350 1726853752.08148: worker is 1 (out of 1 available) 34350 1726853752.08159: exiting _queue_task() for managed_node1/service 34350 1726853752.08373: done queuing things up, now waiting for results queue to drain 34350 1726853752.08375: waiting for pending results... 
34350 1726853752.08431: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 34350 1726853752.08560: in run() - task 02083763-bbaf-b6c1-0de4-00000000010b 34350 1726853752.08581: variable 'ansible_search_path' from source: unknown 34350 1726853752.08590: variable 'ansible_search_path' from source: unknown 34350 1726853752.08633: calling self._execute() 34350 1726853752.08726: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.08736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.08750: variable 'omit' from source: magic vars 34350 1726853752.09075: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.09085: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.09169: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.09174: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.09177: when evaluation is False, skipping this task 34350 1726853752.09180: _execute() done 34350 1726853752.09184: dumping result to json 34350 1726853752.09186: done dumping result, returning 34350 1726853752.09195: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-b6c1-0de4-00000000010b] 34350 1726853752.09198: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010b 34350 1726853752.09283: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010b 34350 1726853752.09285: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 34350 1726853752.09327: no more pending results, returning what we have 34350 1726853752.09331: results queue empty 34350 1726853752.09331: checking for any_errors_fatal 34350 
1726853752.09337: done checking for any_errors_fatal 34350 1726853752.09338: checking for max_fail_percentage 34350 1726853752.09339: done checking for max_fail_percentage 34350 1726853752.09340: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.09341: done checking to see if all hosts have failed 34350 1726853752.09342: getting the remaining hosts for this loop 34350 1726853752.09343: done getting the remaining hosts for this loop 34350 1726853752.09346: getting the next task for host managed_node1 34350 1726853752.09352: done getting next task for host managed_node1 34350 1726853752.09356: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34350 1726853752.09359: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.09377: getting variables 34350 1726853752.09379: in VariableManager get_vars() 34350 1726853752.09417: Calling all_inventory to load vars for managed_node1 34350 1726853752.09419: Calling groups_inventory to load vars for managed_node1 34350 1726853752.09421: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.09428: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.09431: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.09433: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.09624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.09740: done with get_vars() 34350 1726853752.09747: done getting variables 34350 1726853752.09790: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:52 -0400 (0:00:00.019) 0:00:05.970 ****** 34350 1726853752.09812: entering _queue_task() for managed_node1/copy 34350 1726853752.09989: worker is 1 (out of 1 available) 34350 1726853752.10002: exiting _queue_task() for managed_node1/copy 34350 1726853752.10011: done queuing things up, now waiting for results queue to drain 34350 1726853752.10012: waiting for pending results... 
34350 1726853752.10162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 34350 1726853752.10247: in run() - task 02083763-bbaf-b6c1-0de4-00000000010c 34350 1726853752.10258: variable 'ansible_search_path' from source: unknown 34350 1726853752.10264: variable 'ansible_search_path' from source: unknown 34350 1726853752.10293: calling self._execute() 34350 1726853752.10351: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.10354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.10360: variable 'omit' from source: magic vars 34350 1726853752.10611: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.10619: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.10712: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.10715: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.10724: when evaluation is False, skipping this task 34350 1726853752.10731: _execute() done 34350 1726853752.10734: dumping result to json 34350 1726853752.10737: done dumping result, returning 34350 1726853752.10740: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-b6c1-0de4-00000000010c] 34350 1726853752.10743: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010c 34350 1726853752.11031: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010c 34350 1726853752.11036: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.11107: no more pending results, returning what we have 34350 1726853752.11110: results queue empty 34350 
1726853752.11111: checking for any_errors_fatal 34350 1726853752.11116: done checking for any_errors_fatal 34350 1726853752.11117: checking for max_fail_percentage 34350 1726853752.11118: done checking for max_fail_percentage 34350 1726853752.11119: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.11120: done checking to see if all hosts have failed 34350 1726853752.11120: getting the remaining hosts for this loop 34350 1726853752.11122: done getting the remaining hosts for this loop 34350 1726853752.11124: getting the next task for host managed_node1 34350 1726853752.11130: done getting next task for host managed_node1 34350 1726853752.11133: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853752.11136: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.11150: getting variables 34350 1726853752.11152: in VariableManager get_vars() 34350 1726853752.11189: Calling all_inventory to load vars for managed_node1 34350 1726853752.11192: Calling groups_inventory to load vars for managed_node1 34350 1726853752.11194: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.11202: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.11205: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.11208: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.11381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.11765: done with get_vars() 34350 1726853752.11801: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:52 -0400 (0:00:00.020) 0:00:05.991 ****** 34350 1726853752.11895: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853752.12154: worker is 1 (out of 1 available) 34350 1726853752.12178: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 34350 1726853752.12192: done queuing things up, now waiting for results queue to drain 34350 1726853752.12193: waiting for pending results... 
34350 1726853752.12586: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 34350 1726853752.12599: in run() - task 02083763-bbaf-b6c1-0de4-00000000010d 34350 1726853752.12605: variable 'ansible_search_path' from source: unknown 34350 1726853752.12607: variable 'ansible_search_path' from source: unknown 34350 1726853752.12663: calling self._execute() 34350 1726853752.12762: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.12769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.12781: variable 'omit' from source: magic vars 34350 1726853752.13074: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.13083: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.13158: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.13165: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.13168: when evaluation is False, skipping this task 34350 1726853752.13172: _execute() done 34350 1726853752.13175: dumping result to json 34350 1726853752.13178: done dumping result, returning 34350 1726853752.13186: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-b6c1-0de4-00000000010d] 34350 1726853752.13191: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010d 34350 1726853752.13276: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010d 34350 1726853752.13279: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.13329: no more pending results, returning what we have 34350 1726853752.13332: results queue empty 34350 1726853752.13333: checking 
for any_errors_fatal 34350 1726853752.13337: done checking for any_errors_fatal 34350 1726853752.13338: checking for max_fail_percentage 34350 1726853752.13339: done checking for max_fail_percentage 34350 1726853752.13340: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.13341: done checking to see if all hosts have failed 34350 1726853752.13341: getting the remaining hosts for this loop 34350 1726853752.13343: done getting the remaining hosts for this loop 34350 1726853752.13346: getting the next task for host managed_node1 34350 1726853752.13351: done getting next task for host managed_node1 34350 1726853752.13354: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853752.13358: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.13375: getting variables 34350 1726853752.13377: in VariableManager get_vars() 34350 1726853752.13417: Calling all_inventory to load vars for managed_node1 34350 1726853752.13420: Calling groups_inventory to load vars for managed_node1 34350 1726853752.13422: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.13430: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.13432: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.13436: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.13574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.13690: done with get_vars() 34350 1726853752.13697: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:52 -0400 (0:00:00.018) 0:00:06.010 ****** 34350 1726853752.13766: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853752.14002: worker is 1 (out of 1 available) 34350 1726853752.14014: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 34350 1726853752.14027: done queuing things up, now waiting for results queue to drain 34350 1726853752.14028: waiting for pending results... 
34350 1726853752.14390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 34350 1726853752.14433: in run() - task 02083763-bbaf-b6c1-0de4-00000000010e 34350 1726853752.14477: variable 'ansible_search_path' from source: unknown 34350 1726853752.14481: variable 'ansible_search_path' from source: unknown 34350 1726853752.14498: calling self._execute() 34350 1726853752.14581: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.14677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.14681: variable 'omit' from source: magic vars 34350 1726853752.15177: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.15181: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.15184: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.15186: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.15188: when evaluation is False, skipping this task 34350 1726853752.15191: _execute() done 34350 1726853752.15193: dumping result to json 34350 1726853752.15195: done dumping result, returning 34350 1726853752.15198: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-b6c1-0de4-00000000010e] 34350 1726853752.15200: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010e 34350 1726853752.15260: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010e 34350 1726853752.15264: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.15317: no more pending results, returning what we have 34350 1726853752.15321: results queue empty 34350 1726853752.15322: checking for any_errors_fatal 34350 
1726853752.15328: done checking for any_errors_fatal 34350 1726853752.15328: checking for max_fail_percentage 34350 1726853752.15330: done checking for max_fail_percentage 34350 1726853752.15331: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.15332: done checking to see if all hosts have failed 34350 1726853752.15332: getting the remaining hosts for this loop 34350 1726853752.15334: done getting the remaining hosts for this loop 34350 1726853752.15338: getting the next task for host managed_node1 34350 1726853752.15356: done getting next task for host managed_node1 34350 1726853752.15360: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853752.15365: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.15387: getting variables 34350 1726853752.15389: in VariableManager get_vars() 34350 1726853752.15440: Calling all_inventory to load vars for managed_node1 34350 1726853752.15442: Calling groups_inventory to load vars for managed_node1 34350 1726853752.15445: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.15573: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.15577: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.15580: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.15806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.15980: done with get_vars() 34350 1726853752.15990: done getting variables 34350 1726853752.16053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:52 -0400 (0:00:00.023) 0:00:06.033 ****** 34350 1726853752.16086: entering _queue_task() for managed_node1/debug 34350 1726853752.16581: worker is 1 (out of 1 available) 34350 1726853752.16589: exiting _queue_task() for managed_node1/debug 34350 1726853752.16597: done queuing things up, now waiting for results queue to drain 34350 1726853752.16598: waiting for pending results... 
34350 1726853752.16888: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 34350 1726853752.16894: in run() - task 02083763-bbaf-b6c1-0de4-00000000010f 34350 1726853752.16897: variable 'ansible_search_path' from source: unknown 34350 1726853752.16899: variable 'ansible_search_path' from source: unknown 34350 1726853752.16902: calling self._execute() 34350 1726853752.16904: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.16906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.16909: variable 'omit' from source: magic vars 34350 1726853752.17241: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.17252: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.17477: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.17480: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.17483: when evaluation is False, skipping this task 34350 1726853752.17486: _execute() done 34350 1726853752.17488: dumping result to json 34350 1726853752.17491: done dumping result, returning 34350 1726853752.17494: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-b6c1-0de4-00000000010f] 34350 1726853752.17496: sending task result for task 02083763-bbaf-b6c1-0de4-00000000010f 34350 1726853752.17561: done sending task result for task 02083763-bbaf-b6c1-0de4-00000000010f 34350 1726853752.17565: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853752.17611: no more pending results, returning what we have 34350 1726853752.17615: results queue empty 34350 1726853752.17616: checking for any_errors_fatal 34350 1726853752.17621: done 
checking for any_errors_fatal 34350 1726853752.17622: checking for max_fail_percentage 34350 1726853752.17623: done checking for max_fail_percentage 34350 1726853752.17624: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.17625: done checking to see if all hosts have failed 34350 1726853752.17626: getting the remaining hosts for this loop 34350 1726853752.17627: done getting the remaining hosts for this loop 34350 1726853752.17631: getting the next task for host managed_node1 34350 1726853752.17638: done getting next task for host managed_node1 34350 1726853752.17643: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853752.17647: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.17666: getting variables 34350 1726853752.17668: in VariableManager get_vars() 34350 1726853752.17714: Calling all_inventory to load vars for managed_node1 34350 1726853752.17717: Calling groups_inventory to load vars for managed_node1 34350 1726853752.17720: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.17732: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.17735: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.17737: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.18070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.18277: done with get_vars() 34350 1726853752.18286: done getting variables 34350 1726853752.18344: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:52 -0400 (0:00:00.022) 0:00:06.056 ****** 34350 1726853752.18375: entering _queue_task() for managed_node1/debug 34350 1726853752.18620: worker is 1 (out of 1 available) 34350 1726853752.18633: exiting _queue_task() for managed_node1/debug 34350 1726853752.18756: done queuing things up, now waiting for results queue to drain 34350 1726853752.18758: waiting for pending results... 
34350 1726853752.19088: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 34350 1726853752.19094: in run() - task 02083763-bbaf-b6c1-0de4-000000000110 34350 1726853752.19099: variable 'ansible_search_path' from source: unknown 34350 1726853752.19102: variable 'ansible_search_path' from source: unknown 34350 1726853752.19106: calling self._execute() 34350 1726853752.19276: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.19280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.19283: variable 'omit' from source: magic vars 34350 1726853752.19675: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.19680: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.19695: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.19701: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.19704: when evaluation is False, skipping this task 34350 1726853752.19707: _execute() done 34350 1726853752.19710: dumping result to json 34350 1726853752.19712: done dumping result, returning 34350 1726853752.19721: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-b6c1-0de4-000000000110] 34350 1726853752.19730: sending task result for task 02083763-bbaf-b6c1-0de4-000000000110 34350 1726853752.19820: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000110 34350 1726853752.19824: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853752.19881: no more pending results, returning what we have 34350 1726853752.19885: results queue empty 34350 1726853752.19886: checking for any_errors_fatal 34350 1726853752.19893: done 
checking for any_errors_fatal 34350 1726853752.19894: checking for max_fail_percentage 34350 1726853752.19896: done checking for max_fail_percentage 34350 1726853752.19897: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.19897: done checking to see if all hosts have failed 34350 1726853752.19898: getting the remaining hosts for this loop 34350 1726853752.19900: done getting the remaining hosts for this loop 34350 1726853752.19903: getting the next task for host managed_node1 34350 1726853752.19911: done getting next task for host managed_node1 34350 1726853752.19915: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853752.19919: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.19938: getting variables 34350 1726853752.20054: in VariableManager get_vars() 34350 1726853752.20095: Calling all_inventory to load vars for managed_node1 34350 1726853752.20097: Calling groups_inventory to load vars for managed_node1 34350 1726853752.20101: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.20110: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.20112: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.20115: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.20340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.20552: done with get_vars() 34350 1726853752.20562: done getting variables 34350 1726853752.20620: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:52 -0400 (0:00:00.022) 0:00:06.079 ****** 34350 1726853752.20651: entering _queue_task() for managed_node1/debug 34350 1726853752.20887: worker is 1 (out of 1 available) 34350 1726853752.20897: exiting _queue_task() for managed_node1/debug 34350 1726853752.20907: done queuing things up, now waiting for results queue to drain 34350 1726853752.20908: waiting for pending results... 
34350 1726853752.21287: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 34350 1726853752.21296: in run() - task 02083763-bbaf-b6c1-0de4-000000000111 34350 1726853752.21310: variable 'ansible_search_path' from source: unknown 34350 1726853752.21313: variable 'ansible_search_path' from source: unknown 34350 1726853752.21345: calling self._execute() 34350 1726853752.21676: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.21681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.21684: variable 'omit' from source: magic vars 34350 1726853752.21795: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.21813: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.22076: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.22080: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.22082: when evaluation is False, skipping this task 34350 1726853752.22084: _execute() done 34350 1726853752.22086: dumping result to json 34350 1726853752.22087: done dumping result, returning 34350 1726853752.22089: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-b6c1-0de4-000000000111] 34350 1726853752.22091: sending task result for task 02083763-bbaf-b6c1-0de4-000000000111 skipping: [managed_node1] => { "false_condition": "ansible_distribution_major_version == '7'" } 34350 1726853752.22185: no more pending results, returning what we have 34350 1726853752.22188: results queue empty 34350 1726853752.22189: checking for any_errors_fatal 34350 1726853752.22195: done checking for any_errors_fatal 34350 1726853752.22196: checking for max_fail_percentage 34350 1726853752.22197: done checking for max_fail_percentage 34350 
1726853752.22198: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.22199: done checking to see if all hosts have failed 34350 1726853752.22199: getting the remaining hosts for this loop 34350 1726853752.22200: done getting the remaining hosts for this loop 34350 1726853752.22203: getting the next task for host managed_node1 34350 1726853752.22209: done getting next task for host managed_node1 34350 1726853752.22212: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 34350 1726853752.22216: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.22231: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000111 34350 1726853752.22244: getting variables 34350 1726853752.22245: in VariableManager get_vars() 34350 1726853752.22284: Calling all_inventory to load vars for managed_node1 34350 1726853752.22287: Calling groups_inventory to load vars for managed_node1 34350 1726853752.22289: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.22299: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.22302: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.22306: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.22578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.22786: done with get_vars() 34350 1726853752.22795: done getting variables 34350 1726853752.22838: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:52 -0400 (0:00:00.023) 0:00:06.102 ****** 34350 1726853752.22985: entering _queue_task() for managed_node1/ping 34350 1726853752.23446: worker is 1 (out of 1 available) 34350 1726853752.23458: exiting _queue_task() for managed_node1/ping 34350 1726853752.23470: done queuing things up, now waiting for results queue to drain 34350 1726853752.23473: waiting for pending results... 
34350 1726853752.24188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 34350 1726853752.24207: in run() - task 02083763-bbaf-b6c1-0de4-000000000112 34350 1726853752.24267: variable 'ansible_search_path' from source: unknown 34350 1726853752.24676: variable 'ansible_search_path' from source: unknown 34350 1726853752.24681: calling self._execute() 34350 1726853752.24684: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.24688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.24690: variable 'omit' from source: magic vars 34350 1726853752.25438: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.25457: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.25793: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.25805: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.25814: when evaluation is False, skipping this task 34350 1726853752.25822: _execute() done 34350 1726853752.25831: dumping result to json 34350 1726853752.25841: done dumping result, returning 34350 1726853752.25852: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-b6c1-0de4-000000000112] 34350 1726853752.25866: sending task result for task 02083763-bbaf-b6c1-0de4-000000000112 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '7'", "skip_reason": "Conditional result was False" } 34350 1726853752.26017: no more pending results, returning what we have 34350 1726853752.26021: results queue empty 34350 1726853752.26022: checking for any_errors_fatal 34350 1726853752.26028: done checking for any_errors_fatal 34350 1726853752.26029: checking for max_fail_percentage 34350 1726853752.26030: done checking for 
max_fail_percentage 34350 1726853752.26031: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.26032: done checking to see if all hosts have failed 34350 1726853752.26032: getting the remaining hosts for this loop 34350 1726853752.26034: done getting the remaining hosts for this loop 34350 1726853752.26037: getting the next task for host managed_node1 34350 1726853752.26046: done getting next task for host managed_node1 34350 1726853752.26048: ^ task is: TASK: meta (role_complete) 34350 1726853752.26051: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.26075: getting variables 34350 1726853752.26077: in VariableManager get_vars() 34350 1726853752.26128: Calling all_inventory to load vars for managed_node1 34350 1726853752.26131: Calling groups_inventory to load vars for managed_node1 34350 1726853752.26133: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.26145: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.26147: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.26150: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.26727: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000112 34350 1726853752.26732: WORKER PROCESS EXITING 34350 1726853752.26852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.27286: done with get_vars() 34350 1726853752.27299: done getting variables 34350 1726853752.27378: done queuing things up, now waiting for results queue to drain 34350 1726853752.27476: results queue empty 34350 1726853752.27477: checking for any_errors_fatal 34350 1726853752.27480: done checking for any_errors_fatal 34350 1726853752.27481: checking for max_fail_percentage 34350 1726853752.27482: done checking for max_fail_percentage 34350 1726853752.27482: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.27483: done checking to see if all hosts have failed 34350 1726853752.27484: getting the remaining hosts for this loop 34350 1726853752.27485: done getting the remaining hosts for this loop 34350 1726853752.27601: getting the next task for host managed_node1 34350 1726853752.27607: done getting next task for host managed_node1 34350 1726853752.27609: ^ task is: TASK: Include the task 'cleanup_mock_wifi.yml' 34350 1726853752.27611: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 34350 1726853752.27614: getting variables 34350 1726853752.27615: in VariableManager get_vars() 34350 1726853752.27632: Calling all_inventory to load vars for managed_node1 34350 1726853752.27635: Calling groups_inventory to load vars for managed_node1 34350 1726853752.27637: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.27642: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.27644: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.27647: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.27865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.28450: done with get_vars() 34350 1726853752.28458: done getting variables TASK [Include the task 'cleanup_mock_wifi.yml'] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:96 Friday 20 September 2024 13:35:52 -0400 (0:00:00.055) 0:00:06.158 ****** 34350 1726853752.28532: entering _queue_task() for managed_node1/include_tasks 34350 1726853752.29249: worker is 1 (out of 1 available) 34350 1726853752.29262: exiting _queue_task() for managed_node1/include_tasks 34350 1726853752.29339: done queuing things up, now waiting for results queue to drain 34350 1726853752.29341: waiting for pending results... 
34350 1726853752.29753: running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' 34350 1726853752.29872: in run() - task 02083763-bbaf-b6c1-0de4-000000000142 34350 1726853752.30277: variable 'ansible_search_path' from source: unknown 34350 1726853752.30281: calling self._execute() 34350 1726853752.30289: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.30293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.30296: variable 'omit' from source: magic vars 34350 1726853752.31061: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.31291: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.31403: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.31677: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.31680: when evaluation is False, skipping this task 34350 1726853752.31684: _execute() done 34350 1726853752.31686: dumping result to json 34350 1726853752.31688: done dumping result, returning 34350 1726853752.31690: done running TaskExecutor() for managed_node1/TASK: Include the task 'cleanup_mock_wifi.yml' [02083763-bbaf-b6c1-0de4-000000000142] 34350 1726853752.31692: sending task result for task 02083763-bbaf-b6c1-0de4-000000000142 34350 1726853752.31769: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000142 34350 1726853752.31775: WORKER PROCESS EXITING

skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}

34350 1726853752.31825: no more pending results, returning what we have 34350 1726853752.31829: results queue empty 34350 1726853752.31829: checking for any_errors_fatal 34350 1726853752.31831: done checking for any_errors_fatal 34350 1726853752.31831: checking for max_fail_percentage 34350 
1726853752.31833: done checking for max_fail_percentage 34350 1726853752.31833: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.31834: done checking to see if all hosts have failed 34350 1726853752.31835: getting the remaining hosts for this loop 34350 1726853752.31837: done getting the remaining hosts for this loop 34350 1726853752.31841: getting the next task for host managed_node1 34350 1726853752.31849: done getting next task for host managed_node1 34350 1726853752.31851: ^ task is: TASK: Verify network state restored to default 34350 1726853752.31854: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 34350 1726853752.31858: getting variables 34350 1726853752.31859: in VariableManager get_vars() 34350 1726853752.31909: Calling all_inventory to load vars for managed_node1 34350 1726853752.31912: Calling groups_inventory to load vars for managed_node1 34350 1726853752.31914: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.31926: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.31928: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.31931: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.32338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.32808: done with get_vars() 34350 1726853752.32821: done getting variables

TASK [Verify network state restored to default] ********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Friday 20 September 2024  13:35:52 -0400 (0:00:00.044)       0:00:06.203 ******

34350 1726853752.33030: entering _queue_task() for managed_node1/include_tasks 34350 1726853752.33768: worker is 1 (out of 1 available) 34350 1726853752.33786: exiting _queue_task() for managed_node1/include_tasks 34350 1726853752.33798: done queuing things up, now waiting for results queue to drain 34350 1726853752.33799: waiting for pending results... 
34350 1726853752.34132: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 34350 1726853752.34245: in run() - task 02083763-bbaf-b6c1-0de4-000000000143 34350 1726853752.34491: variable 'ansible_search_path' from source: unknown 34350 1726853752.34676: calling self._execute() 34350 1726853752.34680: variable 'ansible_host' from source: host vars for 'managed_node1' 34350 1726853752.34682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 34350 1726853752.34685: variable 'omit' from source: magic vars 34350 1726853752.35384: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.35401: Evaluated conditional (ansible_distribution_major_version != '6'): True 34350 1726853752.35693: variable 'ansible_distribution_major_version' from source: facts 34350 1726853752.35704: Evaluated conditional (ansible_distribution_major_version == '7'): False 34350 1726853752.35711: when evaluation is False, skipping this task 34350 1726853752.35717: _execute() done 34350 1726853752.35723: dumping result to json 34350 1726853752.35730: done dumping result, returning 34350 1726853752.35741: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-b6c1-0de4-000000000143] 34350 1726853752.35750: sending task result for task 02083763-bbaf-b6c1-0de4-000000000143

skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '7'",
    "skip_reason": "Conditional result was False"
}

34350 1726853752.35911: no more pending results, returning what we have 34350 1726853752.35915: results queue empty 34350 1726853752.35916: checking for any_errors_fatal 34350 1726853752.35923: done checking for any_errors_fatal 34350 1726853752.35924: checking for max_fail_percentage 34350 1726853752.35927: done checking for max_fail_percentage 34350 1726853752.35928: checking to see if all hosts have failed and the running result is 
not ok 34350 1726853752.35928: done checking to see if all hosts have failed 34350 1726853752.35929: getting the remaining hosts for this loop 34350 1726853752.35931: done getting the remaining hosts for this loop 34350 1726853752.35936: getting the next task for host managed_node1 34350 1726853752.35947: done getting next task for host managed_node1 34350 1726853752.35949: ^ task is: TASK: meta (flush_handlers) 34350 1726853752.35952: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853752.35958: getting variables 34350 1726853752.35960: in VariableManager get_vars() 34350 1726853752.36032: Calling all_inventory to load vars for managed_node1 34350 1726853752.36036: Calling groups_inventory to load vars for managed_node1 34350 1726853752.36039: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.36176: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.36180: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.36184: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.37041: done sending task result for task 02083763-bbaf-b6c1-0de4-000000000143 34350 1726853752.37046: WORKER PROCESS EXITING 34350 1726853752.37178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.37433: done with get_vars() 34350 1726853752.37445: done getting variables 34350 1726853752.37584: in VariableManager get_vars() 34350 1726853752.37602: Calling all_inventory to load vars for managed_node1 34350 1726853752.37606: Calling groups_inventory to load vars for managed_node1 34350 1726853752.37611: Calling all_plugins_inventory to load vars for managed_node1 34350 
1726853752.37619: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.37621: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.37624: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.37765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.37977: done with get_vars() 34350 1726853752.37991: done queuing things up, now waiting for results queue to drain 34350 1726853752.37993: results queue empty 34350 1726853752.37994: checking for any_errors_fatal 34350 1726853752.37996: done checking for any_errors_fatal 34350 1726853752.37997: checking for max_fail_percentage 34350 1726853752.37998: done checking for max_fail_percentage 34350 1726853752.37999: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.37999: done checking to see if all hosts have failed 34350 1726853752.38000: getting the remaining hosts for this loop 34350 1726853752.38001: done getting the remaining hosts for this loop 34350 1726853752.38004: getting the next task for host managed_node1 34350 1726853752.38007: done getting next task for host managed_node1 34350 1726853752.38009: ^ task is: TASK: meta (flush_handlers) 34350 1726853752.38010: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 34350 1726853752.38013: getting variables 34350 1726853752.38014: in VariableManager get_vars() 34350 1726853752.38031: Calling all_inventory to load vars for managed_node1 34350 1726853752.38033: Calling groups_inventory to load vars for managed_node1 34350 1726853752.38035: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.38045: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.38048: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.38051: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.38193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.38429: done with get_vars() 34350 1726853752.38438: done getting variables 34350 1726853752.38495: in VariableManager get_vars() 34350 1726853752.38513: Calling all_inventory to load vars for managed_node1 34350 1726853752.38515: Calling groups_inventory to load vars for managed_node1 34350 1726853752.38517: Calling all_plugins_inventory to load vars for managed_node1 34350 1726853752.38521: Calling all_plugins_play to load vars for managed_node1 34350 1726853752.38523: Calling groups_plugins_inventory to load vars for managed_node1 34350 1726853752.38526: Calling groups_plugins_play to load vars for managed_node1 34350 1726853752.38673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 34350 1726853752.38887: done with get_vars() 34350 1726853752.38900: done queuing things up, now waiting for results queue to drain 34350 1726853752.38902: results queue empty 34350 1726853752.38903: checking for any_errors_fatal 34350 1726853752.38905: done checking for any_errors_fatal 34350 1726853752.38905: checking for max_fail_percentage 34350 1726853752.38906: done checking for max_fail_percentage 34350 1726853752.38907: checking to see if all hosts have failed and the running result is not 
ok 34350 1726853752.38908: done checking to see if all hosts have failed 34350 1726853752.38909: getting the remaining hosts for this loop 34350 1726853752.38910: done getting the remaining hosts for this loop 34350 1726853752.38929: getting the next task for host managed_node1 34350 1726853752.38932: done getting next task for host managed_node1 34350 1726853752.38933: ^ task is: None 34350 1726853752.38934: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 34350 1726853752.38936: done queuing things up, now waiting for results queue to drain 34350 1726853752.38937: results queue empty 34350 1726853752.38937: checking for any_errors_fatal 34350 1726853752.38938: done checking for any_errors_fatal 34350 1726853752.38939: checking for max_fail_percentage 34350 1726853752.38940: done checking for max_fail_percentage 34350 1726853752.38941: checking to see if all hosts have failed and the running result is not ok 34350 1726853752.38941: done checking to see if all hosts have failed 34350 1726853752.38943: getting the next task for host managed_node1 34350 1726853752.38946: done getting next task for host managed_node1 34350 1726853752.38946: ^ task is: None 34350 1726853752.38947: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=7    changed=0    unreachable=0    failed=0    skipped=102  rescued=0    ignored=0

Friday 20 September 2024  13:35:52 -0400 (0:00:00.059)       0:00:06.262 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 1.47s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_wireless_nm.yml:6
Gather the minimum subset of ansible_facts required by the network role test --- 0.74s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.72s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Include the task 'enable_epel.yml' -------------------------------------- 0.14s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Verify network state restored to default -------------------------------- 0.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_wireless.yml:98
Create EPEL 10 ---------------------------------------------------------- 0.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable --- 0.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
fedora.linux_system_roles.network : Print network provider -------------- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
fedora.linux_system_roles.network : Show debug messages for the network_state --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Install yum-utils package ----------------------------------------------- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
fedora.linux_system_roles.network : Enable and start wpa_supplicant ----- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
fedora.linux_system_roles.network : Show debug messages for the network_connections --- 0.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
34350 1726853752.39053: RUNNING CLEANUP